
Sacred Cow Chips


Tag Archives: Scientism

Conformity and Suppression: How Science Is Not “Done”

26 Thursday Jan 2023

Posted by Nuetzel in Political Bias, Science


Tags

Breakthrough Findings, Citation Politics, Citation Practices, Climate science, Conformist Science, Covid Lockdowns, Disruptive Science, Mary Wortley Montagu, Matt Ridley, NASA, Nature Magazine, Politicized Science, President Dwight Eisenhower, Public Health, Scientism, Scott Sumner, Steven F. Hayward, Wokeness

I’m not terribly surprised to learn that scientific advancement has slowed over my lifetime. A recent study published in the journal Nature documented a secular decline in the frequency of “disruptive” or “breakthrough” scientific research across a range of fields. Research has become increasingly dominated by “incremental” findings, according to the authors. The graphic below tells a pretty dramatic story:

The index values used in the chart range “from 1 for the most disruptive to -1 for the least disruptive.” The methodology used to assign these values, which summarize academic papers as well as patents, produces a few oddities. Why, for example, does the tech revolution of the last 40 years create barely a blip in the technology index in the chart above? And why have tech research and social science research always been more “disruptive” than other fields of study?
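The disruptiveness measure in the Nature study is generally reported to build on the "CD index" of Funk and Owen-Smith: a paper counts as disruptive when later work cites it without citing its references, and as consolidating when later work cites both. The sketch below is a simplification of that idea, not the authors' exact implementation:

```python
def cd_index(citers_of_focal, citers_of_refs):
    """Simplified consolidation-disruption (CD) index.

    citers_of_focal: set of later papers citing the focal work
    citers_of_refs:  set of later papers citing the focal work's references
    Returns a value in [-1, 1]: +1 is purely disruptive (later work cites
    the focal paper but bypasses its references), -1 purely consolidating.
    """
    only_focal = citers_of_focal - citers_of_refs   # cite focal, skip its refs
    both       = citers_of_focal & citers_of_refs   # cite focal and its refs
    only_refs  = citers_of_refs - citers_of_focal   # bypass the focal paper
    total = len(only_focal) + len(both) + len(only_refs)
    if total == 0:
        return 0.0
    return (len(only_focal) - len(both)) / total

# A "disruptive" paper: all later work cites it but none of its references
print(cd_index({"p1", "p2", "p3"}, set()))  # 1.0
```

On this definition, a paper that renders its predecessors obsolete scores near +1, while one that merely extends them scores near -1, which is why a drift toward incrementalism drags the index toward zero and below.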

Putting those questions aside, the Nature paper finds trends that are basically consistent across all fields. Apparently, systematic forces have led to declines in these measures of breakthrough scientific findings. The authors try to provide a few explanations as to the forces at play: fewer researchers, incrementalism, and a growing role of large-team research that induces conformity. But if research has become more incremental, that’s more accurately described as a manifestation of the disease, rather than a cause.

Conformity

Steven F. Hayward skewers the authors a bit, perhaps unfairly, in stating a concern held by many skeptics of current scientific practices. Hayward says the paper:

“… avoids the most significant and obvious explanation with the myopia of Inspector Clouseau, which is the deadly confluence of ideology and the increasingly narrow conformism of academic specialties.”

Conformism in science is nothing new, and it has often interfered with the advancement of knowledge. The earliest cases of suppression of controversial science were motivated by religious doctrine, but challenges to almost any scientific “consensus” seem to be looked upon as heresy. Several early cases of suppression are discussed here. Matt Ridley has described the case of Mary Wortley Montagu, who visited Ottoman Turkey in the early 1700s and witnessed the application of pus from smallpox blisters to small scratches on the skin of healthy subjects. The mild illness this induced led to immunity, but the British medical establishment ridiculed her. A similar fate was suffered by a Boston physician in 1721. Ridley says:

“Conformity is the enemy of scientific progress, which depends on disagreement and challenge. Science is the belief in the ignorance of experts, as [the physicist Richard] Feynman put it.”

When Was the Scientific Boom?

I couldn’t agree more with Hayward and Ridley on the damaging effects of conformity. But what gave rise to our recent slide into scientific conformity, and when did it begin? The Nature study on disruptive science used data on papers and patents starting in 1945. The peak year for disruptive science within the data set was … 1945, but the index values were relatively high over the first two decades of the data set. Maybe those decades were very special for science, with a variety of applications and high-profile accomplishments that have gone unmatched since. As Scott Sumner says in an otherwise unrelated post, in many ways we’ve failed to live up to our own expectations:

“In retrospect, the 1950s seem like a pivotal decade. The Boeing 707, nuclear power plants, satellites orbiting Earth, glass walled skyscrapers, etc., all seemed radically different from the world of the 1890s. In contrast, airliners of the 2020s look roughly like the 707, we seem even less able to build nuclear power plants than in the 1960s, we seem to have a harder time getting back to the moon than going the first time, and we still build boring glass walled skyscrapers.”

It’s difficult to put the initial levels of the “disruptiveness” indices into historical context. We don’t know whether science was even more disruptive prior to 1945, or how the indices used by the authors of the Nature article would have captured it. And it’s impossible to say whether there is some “normal” level of disruptive research. Is a “normal” index value equal to zero, which we now approach as an asymptote?

Some incredible scientific breakthroughs occurred decades before 1945, to take Einstein’s theory of relativity as an obvious example. Perhaps the index value for physical sciences would have been much higher at that time, were it measured. Whether the immediate post-World War II era represented an all-time high in scientific disruption is anyone’s guess. Presumably, the world is always coming from a more primitive base of knowledge. Discoveries, however, usually lead to new and deeper questions. The authors of the Nature article acknowledge and attempt to test for the “burden” of a growing knowledge base on the productivity of subsequent research and find no effect. Nevertheless, it’s possible that the declining pattern after 1945 represents a natural decay following major “paradigm shifts” in the early twentieth century.

The Psychosis Now Known As “Wokeness”

The Nature study used papers and patents only through 2010. Therefore, the decline in disruptive science predates the revolution in “wokeness” we’ve seen over the past decade. But “wokeness” amounts to a radicalization of various doctrines that have been knocking around for years. The rise of social justice activism, critical theory, and anthropogenic global warming theology all began long before the turn of the century and had far-reaching effects that extended to the sciences. The recency of “wokeness” certainly doesn’t invalidate Hayward and Ridley when they note that ideology has a negative impact on research productivity. It’s likely, however, that some fields of study are relatively immune to the effects of politicization, such as the physical sciences. Surely other fields are more vulnerable, like the social sciences.

Citations: Not What They Used To Be?

There are other possible causes of the decline in disruptive science as measured by the Nature study, though the authors believe they’ve tested and found these explanations lacking. It’s possible that an increase in collaborative work led to a change in citation practices. For example, this study found that while self-citation has remained stable, citation of those within an author’s “collaboration network” has declined over time. Another paper identified a trend toward citing review articles in ecology journals rather than the research upon which those reviews were based, resulting in incorrect attribution of ideas and findings. That would directly reduce the measured “disruptiveness” of a given paper, but it’s not clear whether that trend extends to other fields.

Believe it or not, “citation politics” is a thing! It reflects the extent to which a researcher should suck up to prominent authors in a field of study, or to anyone else who might be deemed potentially helpful or harmful. In a development that speaks volumes about trends in research productivity, authors are now urged to append a “Citation Diversity Statement” to their papers. Here’s an academic piece addressing the subject of “gendered citation practices” in contemporary physics. The 11 authors of this paper would do well to spend more time thinking about problems in physics than obsessing about whether their world is “unfair”.

Science and the State

None of those other explanations negates my strong conviction that science has been politicized, and that politicization is harming our progress toward a better world. In fact, it usually leads us astray. Perhaps the most egregious example of politicized conformism today is climate science, though the health sciences went headlong toward a distinctly unhealthy conformism during the pandemic (and see this for a dark laugh).

Politicized science leads to both conformism and suppression. Here are several channels through which politicization might create these perverse tendencies and reduce research productivity or disruptiveness:

  • Political or agenda-driven research is driven by subjective criteria, rather than objective inquiry and even-handed empiricism
  • Research funding via private or public grants is often contingent upon whether the research can be expected to support the objectives of the funding NGOs, agencies, or regulators. The gravy train is reserved for those who support the “correct” scientific narrative
  • Promotion or tenure decisions may be sensitive to the political implications of research
  • Government agencies have been known to block access to databases funded by taxpayers when a scientist wishes to investigate the “wrong questions”
  • Journals and referees have political biases that may influence the acceptance of research submissions, which in turn influences the research itself
  • The favorability of coverage by a politicized media influences researchers, who are sensitive to the damage the media can do to one’s reputation
  • The influence of government agencies on media treatment of scientific discussion has proven to be a potent force
  • The chance that one’s research might have a public policy impact is heavily influenced by politics
  • The talent sought and/or attracted to various fields may be diminished by the primacy of political considerations. Indoctrinated young activists generally aren’t the material from which objective scientists are made

Conclusion

In fairness, there is a great deal of wonderful science being conducted these days, despite the claims appearing in the Nature piece and the politicized corruption undermining good science in certain fields. Tremendous breakthroughs are taking place in areas of medical research such as cancer immunotherapy and diabetes treatment. Fusion energy is inching closer to a reality. Space research is moving forward at a tremendous pace in both the public and private spheres, despite NASA’s clumsiness.

I’m sure there are several causes for the 70-year decline in scientific “disruptiveness” measured in the article in Nature. Part of that decline might have been a natural consequence of coming off an early twentieth-century burst of scientific breakthroughs. There might be other clues related to changes in citation practices. However, politicization has become a huge burden on scientific progress over the past decade. The most awful consequences of this trend include a huge misallocation of resources from industrial planning predicated on politicized science, and a meaningful loss of lives owing to the blind acceptance of draconian health policies during the Covid pandemic. When guided by the state or politics, what passes for science is often no better than scientism. There are, however, even in climate science and public health disciplines, many great scientists who continue to test and challenge the orthodoxy. We need more of them!

I leave you with a few words from President Dwight Eisenhower’s Farewell Address in 1961, in which he foresaw issues related to the federal funding of scientific research:

“Akin to, and largely responsible for the sweeping changes in our industrial-military posture, has been the technological revolution during recent decades.

In this revolution, research has become central; it also becomes more formalized, complex, and costly. A steadily increasing share is conducted for, by, or at the direction of, the Federal government.

Today, the solitary inventor, tinkering in his shop, has been overshadowed by task forces of scientists in laboratories and testing fields. In the same fashion, the free university, historically the fountainhead of free ideas and scientific discovery, has experienced a revolution in the conduct of research. Partly because of the huge costs involved, a government contract becomes virtually a substitute for intellectual curiosity. For every old blackboard there are now hundreds of new electronic computers.

The prospect of domination of the nation’s scholars by Federal employment, project allocations, and the power of money is ever present and is gravely to be regarded.

Yet, in holding scientific research and discovery in respect, as we should, we must also be alert to the equal and opposite danger that public policy could itself become the captive of a scientific-technological elite.”

Interventionists Love You and Demand You Change, or Else

19 Friday Aug 2022

Posted by Nuetzel in Central Planning, Industrial Policy, Uncategorized


Tags

CHIPS Act, David McGrogan, Deirdre McCloskey, Don Boudreaux, Industrial Planning, Inflation Reduction Act, Jason Brennan, Joseph Stiglitz, Lionel Trilling, Lockdowns, Pandemic, Paul Krugman, Scientism, Solyndra

Statistics and measurement might not be critical to the exercise of the authoritarian impulse, but they have served to enable the technocratic tyranny idealized by contemporary statists. Certain influential thinkers have claimed our ability to compile statistics helps give rise to the bureaucratized state. I ran across a great post that led with that topic: “The Brutalization of Compassion” by David McGrogan. The mere ability to compile relevant statistics on a population and its well-being (income, jobs, wages, inequality, mortality, suicide, etc.) can motivate action by authorities to “improve” matters. The purpose might be to get ahead of rival states, or the action might be rationalized as compassion. But watch out! McGrogan quotes a bit of cautionary wisdom from Lionel Trilling:

“‘When once we have made our fellow men the objects of our enlightened interest,’ he put it, something within us causes us to then ‘go on and make them the objects of our pity, then of our wisdom, ultimately of our coercion.’”

Ultimately, to pursue their vision, interventionists must impose controls on behaviors. In practice, that means any variance or attempted variance must be penalized. Here’s McGrogan’s description of the steps in this process:

“The conceptualisation of the population as a field of action, and the measurement of statistical phenomenon within it – the taking of an ‘enlightened interest’ in it – gives rise to both ‘pity,’ or compassion, and the application of ‘wisdom’ to resolve its problems. What is left, of course, is coercion, and we do not need to look far to identify it in the many means by which the modern state subjects the population to a kind of Tocquevillian ‘soft despotism,’ constantly manipulating, cajoling and maneuvering it this way and that for its own good, whether through compulsory state education or ‘sin taxes’ or anything in between.”

Follow the Scientism

I can’t neglect to mention another important condition: the hubris among apparatchiks who imagine the state can improve upon private institutions to achieve social betterment. They will always fail in attempts to replace the action of the private markets and the price mechanism to process information relating to scarcities and preferences. Absent that facility, human planners cannot guide flows of resources to their most valued uses. In fact, they nearly always botch it!

Government provision of public goods is one concession worth making, but the state capacity needed to fulfill this legitimate function is subject to severe mission creep: we frequently see efforts to characterize goods and services as “public” despite benefits that are almost wholly private (e.g. education). Likewise, we often hear exaggerated claims of “harms” requiring state intervention (e.g. carbon emissions). These situations often hinge purely on politics. Even when legitimate external benefits or costs can be identified, there is a pretension that they can be accurately measured and corrected via subsidies or taxes. This is far-fetched. At best, it’s possible to vouch for the directional appropriateness of some interventions, but the magnitude of corrective measures is variable and essentially unknowable. Too often we see government failure via over-subsidization of politically favored activities and over-penalization of politically disfavored activities.

One of the most egregious errors of intervention is the over-application of the precautionary principle: if risks are associated with an activity, then it must be curtailed. This often relies on measurements of highly uncertain causes and effects, and it involves aggregation subject to its own biases.

Just as questionable is the ability of “experts” to model natural or behavioral processes such that outcomes can be “predicted” over horizons extending many decades forward. That interventionists tend to ignore the uncertainties of these predictions is the most blatant and damaging conceit of all, not least because the public and the media usually have limited knowledge with which to assess the phenomenon in question.

Public Health Tyranny

The Covid pandemic presented a compelling excuse for precautionists in government and even private institutions to impose radical controls under a set of claims they called “the science”. These claims were often false and really antithetical to the principles of scientific inquiry, which calls for continually questioning hypotheses, even when they represent “consensus”. Yet a series of questionable scientific claims were used to justify abridgment of basic freedoms for the general population, most of whom faced little risk from the virus. This included lockdowns of schools and churches, business closures, cancellation of public events (except of course for protests and riots by Leftists), deferred medical care, vaccine mandates, and mask mandates. The damage these measures inflicted was fierce, and in the end we know that it was almost entirely unnecessary. Still, the public health establishment seems all too willing to ignore the facts in its readiness to repeat the whole range of mistakes at the slightest uptick in what’s now an endemic infection.

Standard Issue Cronyism

In the wake of the pandemic, we’ve witnessed a surge in calls for government to enhance the security of our nation’s supply chains. Too large a share of the critical goods required by domestic industries are produced overseas, which has made supply disruptions, and the threat of future disruptions, especially acute. Right on cue, advocates of industrial policy and planning have arranged for the federal government to provide $85 billion to domestic producers of semiconductors under the so-called CHIPS Act. But semiconductor producers are in no need of government incentives to “re-shore” production:

“… there has been even more chipmaking investment dedicated to the U.S. market, even as federal subsidies have languished. Construction is now underway at four major U.S. facilities and will continue with or without subsidies—something even Intel reluctantly acknowledged when it delayed the groundbreaking ceremony on its much‐ballyhooed Ohio facility to protest congressional inaction. This is because, as numerous experts have explained over the last year, there are real economic and geopolitical reasons to invest in additional U.S. semiconductor production—no federal subsidies needed.”

Moreover, the global shortage of computer chips appears to be ending. The subsidies will unnecessarily enrich industrialists and their shareholders, provide a source of graft to bureaucrats and various middle men, and likely over-allocate resources to domestic production of chips. Industrial planning of this kind has a long history of failure, and this time won’t be different.

Climate Fascists

We also see repeated over-application of the precautionary principle and rising dominance of industrial policy in climate and energy policy. Enormous sacrifices are imposed on consumers for the sake of minuscule changes in global carbon emissions and the “expected” long-term path of future “global” temperatures. The interventions taken in pursuit of these objectives are draconian, limiting choices and raising the cost of virtually everything produced and consumed. They distort the direction of physical investment, disfavoring reliable sources of base load capacity needed for growth, and also disfavoring the safest and most reliable zero-carbon alternative: nuclear power. The renewable energy sources foolishly pushed by the state and the ESG establishment are environmentally costly in their own right, and they don’t work when natural conditions are unfavorable. As one wag says about the climate provisions of the ironically named Inflation Reduction Act, “Gonna be a lot more Solyndras coming”.

And talk about sloppy! Our “trusted representatives” in Congress could hardly be bothered to pretend they’d done their homework. They neglected to provide any quantitative carbon and temperature impacts of the legislation. This must be a case of true honesty, because they really have no idea!

Delusions of Central Planning

One great weakness (among many) of arguments for state industrial planning is the assumption that government agents are somehow more competent, efficient, and “pure of heart” than agents in the private sector. Nothing could be more laughable. On this point, some of the most incisive commentary I’ve seen is provided by the masterful Don Boudreaux, first quoting Georgetown philosopher Jason Brennan before adding his own entertaining thoughts:

The typical way the left argues for the state is to describe what economists in the 1850s thought markets would be like under monopoly or monopsony, and then compare that to a state run by angels. Both halves of the argument are bad, and yet philosophy treats this as if it were rigorous and sophisticated.

“Far too many policy proposals are nothing more than prayers to the state-god. ‘We entreat you, Oh Powerful and Sacred One, to relieve our people of this or that misery, blemish, and market imperfection! We beseech you to bestow upon us – your faithful servants – cosmic justice, safety from new pathogens, unkind thoughts, and microaggressions, and protection from each and every burden of reality that we can imagine being cured by an omniscient, benevolent, and omnipotent deity! If we obey – and sacrifice to you without complaint our treasure and our freedoms – you will provide!’

I do not exaggerate. Pick at random any proposed government intervention offered by the likes of Progressives or national conservatives, and you’ll discover that the workability of this proposed intervention, when evaluated honestly, rests on nothing more solid than the above absurd faith that the state is – or, when in the right hands, will be – a secular god.”

On the idealization of government’s ability to “plan the economy” rationally, here is more from Boudreaux, first quoting the great Deirdre McCloskey:

Deep in left-wing thought about the economy, and in a good deal of right-wing thought, too, is the premise, as Isaiah Berlin once put it with a sneer, that government can accomplish whatever it rationally proposes to do. As has been often observed about leftists even as sweet as John Rawls, the left has no theory of the behavior of the government. It assumes that the government is a perfect expression of the will of The People.

“And nothing is more unscientific – indeed, more mystical – than is this still-commonplace practice of most Progressives, and also of very many conservatives, to analyze the economy and society, and to offer policy recommendations, using such a juvenile ‘understanding’ of the state. Yet such an ‘understanding’ of the state permeates the work even of some Nobel laureates in economics – laureates such as Paul Krugman and Joseph Stiglitz. This ‘understanding’ of the state is inseparable also from the work of pundits too many to count…

That these professors and pundits think of themselves as scientific – and are widely regarded as being especially intelligent, thoughtful, and scientific – testifies to the strength of the cult of democratically rubber-stamped coercion.”

Conclusion

Humans have proven to be incredible documentarians. The advent of measurement techniques and increasingly sophisticated methods of accounting for various phenomena has enabled better ways of understanding our world and our well-being. Unfortunately, a by-product was the birth of scientism, the belief that men in authority are capable not only of measuring, but of fine-tuning, the present and future details of society and social interaction. Those pretensions are terribly mistaken. However, the actions of Congress and the Biden Administration prove that its adherents will never be persuaded, despite repeated demonstrations of the futility of central planning. Their words of compassion are no comfort — they must coerce the ones they “love”.

Price Controls: Political Gut Reaction, Gut Punch To Public

06 Thursday Jan 2022

Posted by Nuetzel in Price Controls, Shortage


Tags

Artificial Tradeoffs, Big Meat, Big Oil, Black Markets, central planning, Excess Demand, Federal Reserve, Inflation, Isabella Weber, Joe Biden, Money Supply, Paul Krugman, Price Controls, Relative Prices, Scientism, Shortage, Unintended Consequences

In a gross failure of education or perhaps memory, politicians, policymakers, and certain academics seem blithely ignorant of things we’ve learned repeatedly. And of all the dumb ideas floated regarding our current bout with inflation, the notion of invoking price controls is near the top. But watch out, because the Biden Administration has already shifted from “inflation is transitory” to “it only hurts the rich” to “it’s fine because people just want to buy things”, and now “greedy businessmen are the culprits”. The latter falsehood is indeed the rationale for price controls put forward by a very confused economist at the University of Massachusetts-Amherst named Isabella Weber. (See this for an excerpt and a few immediate reactions.) She makes me grieve for my profession… even the frequently ditzy Paul Krugman called her out, though he softened his words after realizing he might have offended some of his partisan allies. Of course, the idea of price controls is just bad enough to gain favor with the lefty goofballs pulling Biden’s strings.

To understand the inflation process, it’s helpful to distinguish between two different dynamics:

1. When prices change we usually look for explanations in supply and demand conditions. We have supply constraints across a range of markets at the moment. There’s also a great deal to say about the ways in which government policy is hampering supplies of labor and energy, which are key inputs for just about everything. It’s fair to note here that, rather than price controls, we just might do better to ask government to get out of the way! In addition, however, consumer demand rebounded as the pandemic waned and waxed, and the federal government has been spending hand over fist, with generous distributions of cash with no strings attached. Thus, supply shortfalls and strong demand have combined to create price pressures across many markets.

2. Economy-wide, all dollar prices cannot rise continuously without an excess supply of a monetary asset. The Federal Reserve has discussed tapering its bond purchases in 2022 and its intention to raise overnight interest rates starting in the spring. It’s about time! The U.S. money supply ballooned during 2020 and its growth remains at a gallop. This has enabled the inflation we are experiencing today, and only recently have the markets begun to react as if the Fed means business.

Weber, our would-be price controller, exhibits a marked ignorance with respect to both aspects of price pressure: how markets work in the first instance, and how monetary profligacy lies at the root of broader inflation. Instead, she insists that prices are rising today because industrialists have simply decided to extract more profit! Poof! It’s as simple as that! Well what was holding those greedy bastards back all this time?

Everyone competes for scarce resources, so prices are bid upward when supplies are short, inputs more costly, or demand is outpacing supply for other reasons. Sure, sellers may earn a greater margin on sales under these circumstances. But the higher price accomplishes two important social objectives: efficient rationing of available quantities, and greater incentives to bring additional supplies to market.

So consider the outcome when government takes the advice of a Weber: producers are prohibited from adjusting price in response to excess demand. Shortages develop. Consumers might want more, but that’s either impossible or it simply costs more. Yet producers are prohibited from pricing commensurate with that cost. Other adjustments soon follow, such as changes in discounts, seller credit arrangements, and product quality. Furthermore, absent price adjustment, transaction costs become much more significant. Other resources are consumed in the mere process of allocating available quantities: time spent in queues, administering quotas, lotteries or other schemes, costly barter, and ultimately unsatisfied needs and wants, not to mention lots of anger and frustration. Lest anyone think this process is “fair”, keep in mind that it’s natural for these allocations to take a character that is worse than arbitrary. “Important people” will always have an advantage under these circumstances.
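The rationing failure described above can be illustrated with a toy linear supply-and-demand model. The parameters here are purely illustrative, chosen only to show how a binding ceiling opens a gap between quantity demanded and quantity supplied:

```python
def excess_demand(ceiling, a=100.0, b=2.0, c=10.0, d=1.0):
    """Toy market with linear demand Qd = a - b*p and supply Qs = c + d*p.

    All parameters are illustrative. Returns the market-clearing price
    and the shortage (excess demand) that emerges at the price ceiling.
    """
    p_star = (a - c) / (b + d)    # price where Qd = Qs
    p = min(ceiling, p_star)      # a ceiling only binds if set below p*
    shortage = (a - b * p) - (c + d * p)
    return p_star, max(0.0, shortage)

p_star, gap = excess_demand(ceiling=20.0)
# p* = 30; at a ceiling of 20, Qd = 60 but Qs = 30: a shortage of 30 units
```

Note that the shortage is zero whenever the ceiling sits above the market-clearing price; the queues, quotas, and black markets only appear once the control actually binds.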

Regulatory and financial burdens are imposed on those who play by the rules, but not everyone does. Black market mechanisms come into play, including opportunities for illegal side payments, rewards for underworld activity, along with a general degradation in the rule of law.

Price controls also impose rigidity in relative prices that can be very costly for society. “Freezing” the value of one good in terms of others distorts the signals upon which efficient resource allocation depends. Tastes, circumstances, and production technology change, and flexible relative prices enable smoother transitions between these states. And even while demand and/or input scarcity might increase in all markets, these dynamics are never uniform. Over time, imbalances always become much larger in some markets than others. Frozen relative prices allow these imbalances to persist.

For example, the true value of good A at the imposition of price controls might be two units of good B. Over time, the true value of A might grow to four units of good B, but the government insists that A must be traded for no more than the original two units of B. Good B thus becomes overvalued on account of government intervention. The market for good A, which should attract disproportionate investment and jobs, will instead languish under a freeze of relative prices. Good B will continue to absorb resources under the artificial tradeoff imposed by price controls. Society must then sacrifice the gains otherwise afforded by market dynamism.

The history of price controls is dismal (also see here). They artificially suppress measured inflation and impose great efficiency costs on the public. Meanwhile, price controls fail to address the underlying monetary excess.

Price controls are destructive when applied economy-wide, but also when governments attempt to apply them to markets selectively. Posturing about “strategic” use of price controls reveals the naïveté of those who believe government planners can resolve market dislocations better than market participants themselves. Indeed, the planners would do better to discover, and undo, the damage caused by so many ongoing regulatory interventions.

So beware Joe Biden’s bluster about “greedy producers” in certain markets, whether they be in “Big Meat”, or “Big Oil”. Price interventions in these markets are sure to bring you less meat, less oil, and quite possibly less of everything else. The unintended consequences of such government interventions aren’t difficult to foresee unless one is blinded by the scientism of central planning.

Hyperbolic Scenarios, Crude Climate Models, and Scientism

07 Sunday Nov 2021

Posted by Nuetzel in Climate science, Global Warming

≈ 6 Comments

Tags

Carbon Efficiency, Carbon forcing, carbon Sensitivity, Cloud Feedback, COP26, G20, Global Temperature, IEA, Intergovernmental Panel on Climate Change, International Energy Agency, IPCC, Joe Biden, Joe Brandon, Judith Curry, Justin Ritchie, Net Zero Emissions, Nic Lewis, Precautionary Principle, Prince Charles, RCP8.5, rent seeking, Representative Concentration Pathway, Roger Pielke Jr., Scientism, United Nations

What we hear regarding the dangers of climate change is based on predictions of future atmospheric carbon concentrations and corresponding predictions of global temperatures. Those predictions are not “data” in the normal, positive sense. They do not represent “the way things are” or “the way things have been”, though one might hope the initial model conditions align with reality. Nor can the predictions be relied upon as “the way things will be”. Climate scientists normally report a range of outcomes produced by models, yet we usually hear only one type of consequence for humanity: catastrophe!

Models Are Not Reality

The kinds of climate models quoted by activists and by the UN’s Intergovernmental Panel on Climate Change (IPCC) have been around for decades. Known as “carbon forcing” models, they are highly simplified representations of the process determining global temperatures. The primary forecast inputs are atmospheric carbon concentrations over time, which again are themselves predictions.

It’s usually asserted that climate model outputs should guide policy, but we must ask: how much confidence can we place in these predictions before allowing government to take coercive actions having immediate, negative impacts on human well-being? What evidence can be marshaled to show prospective outcomes under proposed policies? And how well do these models fit the actual, historical data? That is, how well do model predictions track our historical experience, given the historical paths of inputs like carbon concentrations?

Faulty Inputs

The IPCC has been defining and updating sets of carbon scenarios since 1990. The scenarios outline the future paths of greenhouse gas emissions (and carbon forcings). They were originally based on economic and demographic modeling before an apparent “decision by committee” to maintain consistency with scenarios issued in the past. Roger Pielke Jr. and Justin Ritchie describe the evolution of this decision process, and they call for change:

“Our research (and that of several colleagues) indicates that the scenarios of greenhouse gas (GHG) emissions through the end of the twenty-first century are grounded in outdated portrayals of the recent past. Because climate models depend on these scenarios to project the future behavior of the climate, the outdated scenarios provide a misleading basis both for developing a scientific evidence base and for informing climate policy discussions. The continuing misuse of scenarios in climate research has become pervasive and consequential—so much so that we view it as one of the most significant failures of scientific integrity in the twenty-first century thus far. We need a course correction.”

One would certainly expect the predicted growth of atmospheric carbon to evolve over time. However, as Pielke and Ritchie note, the IPCC’s baseline carbon scenario today, known as RCP8.5 (“Representative Concentration Pathway”), is remarkably similar to the “business as usual” (BAU) scenario it first issued in 1990:

“The emissions scenarios the climate community is now using as baselines for climate models depend on portrayals of the present that are no longer true. And once the scenarios lost touch with reality, so did the climate, impact, and economic models that depend on them for their projections of the future. Yet these projections are a central part of the scientific basis upon which climate policymakers are now developing, debating, and adopting policies.”

The authors go on to discuss a few characteristics of the BAU scenario that today seem implausible, including:

“… RCP8.5 foresees carbon dioxide emissions growing rapidly to at least the year 2300 when Earth reaches more than 2,000 ppm of atmospheric carbon dioxide concentrations. But again, according to the IEA and other groups, fossil energy emissions have likely plateaued, and it is plausible to achieve net-zero emissions before the end of the century, if not much sooner.”

Pielke and Ritchie demonstrate that the IPCC’s baseline range of carbon emissions by 2045 is centered well above (actually double) the mid-range of scenarios developed by the International Energy Agency (IEA), and there is very little overlap between the two. However, global carbon emissions have been flat over the past decade. Even if we extrapolate the growth in atmospheric CO2 parts per million over the past 20 years, it would rise to less than 600 ppm by 2100, not 1,200 ppm. It’s true that a few countries (China comes to mind) continue to exploit less “carbon efficient” energy resources like coal, but the growth trend in concentrations is likely to continue to taper over time.
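The extrapolation above can be made explicit. The concentration figures below are approximate readings of atmospheric CO2 over the past two decades (my assumptions for illustration, not figures from Pielke and Ritchie):

```python
# Back-of-the-envelope linear extrapolation of atmospheric CO2, as described above.
# Concentrations are approximate annual means; treat this purely as an illustration.

ppm_2001 = 371.0   # approximate level in 2001
ppm_2021 = 416.0   # approximate level in 2021, when this post was written

annual_growth = (ppm_2021 - ppm_2001) / 20.0   # ppm per year over the past 20 years

ppm_2100 = ppm_2021 + annual_growth * (2100 - 2021)

print(f"Linear growth rate: {annual_growth:.2f} ppm/yr")
print(f"Extrapolated 2100 level: {ppm_2100:.0f} ppm")
```

Even without any tapering of the growth trend, the straight-line figure lands well under 600 ppm, far below the 1,200 ppm contemplated by RCP8.5.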

It therefore appears that the IPCC’s climate scenarios, which are used broadly as model inputs by the climate research community, are suspect. As the saying goes: garbage in, garbage out. But what about the climate models themselves?

Faulty Models

The model temperature predictions have been grossly in error. They have been and continue to be “too hot”. The chart at the top of this post is typical of the comparisons of model projections and actual temperatures. Before the year 2000, most of the temperature paths projected by the particular model charted above ran higher than actual temperatures. However, the trends subsequently diverged and the gap has become more extreme over the past two decades.

The problem is not merely one of faulty inputs. The models themselves are deeply flawed, as they fail to account adequately for natural forces that strongly influence our climate. It’s been clear for many years that the sun’s radiative energy has a massive impact on temperatures, and it is affected not only by the intensity of the solar cycle but also by cloud cover on Earth. Unfortunately, carbon forcing models do not agree on the role that increased clouds might have in amplifying warming. However, a reduction in cloud cover over the past 20 years, and a corresponding increase in radiative heat, can account for every bit of the warming experienced over that time.

This finding not only offers an alternative explanation for two decades of modest warming, it also strikes at the heart of the feedback mechanism usually assumed to amplify carbon-induced warming. The overall effect is summarized by the so-called carbon sensitivity, measured as the response of global temperature to a doubling of carbon concentration. The IPCC puts that sensitivity in a range of 1.5C to 4.5C. However, findings published by Nic Lewis and Judith Curry are close to the low end of that range, as are those found by Frank Bosse reported here. The uncertainties surrounding the role of cloud cover and carbon sensitivities reveal that the outputs relied upon by climate alarmists are extreme model simulations, not the kind of reliable intelligence upon which drastic policy measures should be taken.
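The sensitivity concept can be made concrete with the standard logarithmic forcing approximation, in which equilibrium warming equals the sensitivity times the base-2 logarithm of the concentration ratio. A minimal sketch, using the conventional 280 ppm pre-industrial baseline and an illustrative end-of-century concentration of my choosing:

```python
import math

# Standard logarithmic approximation: warming = sensitivity * log2(C / C0),
# where sensitivity is the equilibrium temperature response to a doubling of CO2.
# The functional form is textbook; the concentration below is illustrative.

def warming(sensitivity_c, c_ppm, c0_ppm=280.0):
    """Equilibrium warming (deg C) relative to a pre-industrial baseline c0_ppm."""
    return sensitivity_c * math.log2(c_ppm / c0_ppm)

c_2100 = 600.0  # an illustrative end-of-century concentration

for s in (1.5, 3.0, 4.5):  # low, mid, and high ends of the IPCC sensitivity range
    print(f"sensitivity {s} C per doubling -> {warming(s, c_2100):.1f} C by 2100")
```

The spread across that range is wide: at the low-end sensitivities reported by Lewis and Curry, the same concentration path implies far milder warming than the alarming mid- and high-end figures, which is why the sensitivity estimate matters so much for policy.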

The constant anxiety issued from the Left on the issue of climate change, and not a little haranguing of the rest of us, is misplaced. The IPCC’s scenarios for the future paths of carbon concentration are outdated and seriously exaggerated, and they represent a breach of scientific protocol. Yet the scenarios are widely used as the basis of policy discussions at both the domestic and international levels. The climate models themselves embed questionable assumptions that create a bias toward calamitous outcomes.

Yet Drastic Action Is Urged

The UN’s 2021 climate conference, or COP26 (“Conference of the Parties …”) is taking place in Glasgow, Scotland this month. Like earlier international climate conferences, the hope is that dire forecasts will prompt broad agreement on goals and commitments, and that signatory countries will translate these into policy at the national level.

Things got off to a bad start when, before COP26 even began, the G20 nations failed to agree on a goal of “net-zero” carbon emissions by 2050. Another bad portent for the conference is that China and India, both big carbon emitters, will not attend, which must be tremendously disappointing to attendees. After all, COP26 has been billed by Prince Charles himself as “the last chance saloon, literally”, for saving the world from catastrophe. He said roughly the same thing before the Paris conference in 2015. And Joe Brandon … er, Biden, blurted some hyperbole of his own:

“Climate change is already ravaging the world. … It’s destroying people’s lives and livelihoods and doing it every single day. … It’s costing our nations trillions of dollars.”

All this is unadulterated hogwash. But it is the stuff upon which a crisis-hungry media feeds. This hucksterism is but one form of climate rent-seeking. Other forms are much more troubling: scary scenarios and model predictions serve the self-interest of regulators, grant-seeking researchers, interventionist politicians, and green investors who suckle at the public teat. It is a nightmare of scientism fed by the arrogance of self-interested social planners. The renewable energy technologies promoted by these investors, politicians, and planners are costly and land-intensive, providing only intermittent output (requiring backup fossil fuel capacity), and they have nasty environmental consequences of their own.

The precautionary principle is no excuse for the extreme policies advocated by alarmists. We already have economically viable “carbon efficient” and even zero-carbon energy alternatives, such as natural gas, modular nuclear power, and expanded opportunities for exploiting geothermal energy. This argues against premature deployment of wasteful renewables. The real crisis is the threat posed by the imposition of draconian green policies to our long-term prosperity, and especially to the world’s poor.

Portents of Harris-Biden Nation

22 Thursday Oct 2020

Posted by Nuetzel in Politics

≈ Leave a comment

Tags

#MeToo, Anthony Weiner, Antifa, Barack Obama, Black Lives Matter, Court Packing, Critical Race Theory, Donald Trump, Green New Deal, Harvey Weinstein, Hunter Biden, Jeffrey Toobin, Joe Biden, Kamala Harris, Lockdowns, Marxism, Nancy Pelosi, Public Health, Scientism

Joe Biden is a weak figurehead, a one-time moderate faltering atop a coalition of leftists. If you wonder why Nancy Pelosi floated legislation to establish a committee on “presidential capacity,” don’t think so much about her loathing for Donald Trump; think about poor Joe Biden. He might be shunted aside just as soon as the power grab isn’t too obvious. They know well how Barack Obama famously said, “Don’t underestimate Joe’s ability to f*ck things up.” But whether or not Joe Biden is in control of anything, think about who he stands with:

The Violent Left: Marxist Antifa and Marxist BLM; opposed to law and order; burning cities; spewing eliminationist rhetoric; hissing n*g**r at black cops;

Police Defunders: won’t acknowledge good policing is needed more than ever, especially in minority communities;

“Ministers of Truth”: social media platforms exerting control over what we say and what we see;

Re-Educators: democrats push for a “Truth and Reconciliation Commission” to address the “issue” of Trump supporters;

Critical Race Theorists: a Marxist front whereby every word and action is viewed in the context of racial bias and victimization; they want reparations; on your knees.

The Scientistic: who labor under the delusion that “science” should guide all administrative and political decisions. Or someone’s version of science. The very idea is antithetical to the scientific domain, which deals only with falsifiable hypotheses. Few matters of value can be addressed using the tools of science exclusively, nor can those tools settle matters of ethics.

Fear Mongers: would rule by precaution; risks are always worth exaggerating to existential proportions;

Lockdown Tyrants: refuse to acknowledge the steep public health costs of lockdowns; stripping individual liberties indefinitely, including the right to contract, free practice of religion, and assembly;

Insurrectionists: who fabricated a Russian collusion hoax to subvert the 2016 election, and later to overthrow a sitting president;

Gun Confiscators: they will if we let them;

Abortionists: would use federal tax dollars to fund the murder of millions of babies late into pregnancy, primarily black babies;

Fluid-Genderists: insist that children should be encouraged to explore transgenderism;

Taxers: won’t stop with punitive taxes on the wealthy and employers; it’s just not easy to milk high earners in a way that’s sufficient to pay for the fiscal debauchery demanded by the Biden-Harris constituency. Joe says he will raise taxes by $3.4 trillion.

Spenders: $2 trillion of new federal education outlays, including universal pre-K and free community college; the Green New Deal (see below). After all, the democrats are the party that can’t tell the difference between a cut in spending and a reduction in spending growth. If you think Trump is a big spender, their plans are astonishing;

Green New Dealers: would spend trillions to restrict energy choices, transfer U.S. wealth overseas in the name of international carbon reduction, and reduce our standard of living;

Redistributionists: would tax job creators not simply for the benefit of supporting the needy, but for anyone regardless of need (see UBI); this extends to plans to bail out blue states and cities with insolvent public employee pension funds;

Interventionists: would regulate all phases of life, including straws, sugary drinks, and your fireplace; they will burden private initiative; create artificial, politically-favored winners skilled at manipulating regulatory rules for competitive reasons; and create losers who are typically too small to handle the burden;

Medical Socialists: will strip your private health insurance, dictate the care you may receive, fix prices, and regulate physicians and other providers. You’ll love the care abroad, if you can afford to get out when you’re sick.

Public School Monopolists: poorly performing, beholden to teachers’ unions, unresponsive to taxpayers and often parents; they would happily revoke school choice;

Federal Suburb Rezoners: demanding low-income housing in every community;

Court Packers: to destroy the independent judiciary;

Iran Apologists: give them cash on the tarmac, let them develop their “peaceful” nuclear program; alienate the rest of the Middle East;

Grifters: marketing their influence as public servants for private gain; never exclusive to one side of the aisle, but the Biden family has certainly traded on Joe to enrich themselves;

Smear Merchants: fabricated allegations against Brett Kavanaugh; impugned Amy Coney Barrett’s religious faith;

Perverts: Harvey Weinstein, Anthony Weiner, Jeffrey Toobin, Hunter Biden, and Bill Clinton, to name just a few; even Joe has his #MeToo accusers;

I could go on and on, but Harris-Biden voters should get a strong taste of their compatriots from the list above. It reflects the overriding prescriptive, bullying, and sometimes violent nature of the Left. They’d have you think all material goods can be free. Presto! They presume to have the knowledge and wisdom to plan the economy and your life better than you can, better than free markets and free people. What they’ll need is a lot of magic, or it won’t go well. You’ll get poverty and tears. I’m not sure Joe has the desire or the wherewithal to rein in his coalition of idiots.

Central Planning With AI Will Still Suck

23 Sunday Feb 2020

Posted by Nuetzel in Artificial Intelligence, Central Planning, Free markets

≈ Leave a comment

Tags

Artificial Intelligence, central planning, Common Law, Data Science, Digital Socialism, Friedrich Hayek, Jesús Fernández-Villaverde, Machine Learning, Marginal Revolution, Property Rights, Robert Lucas, Roman Law, Scientism, The Invisible Hand, The Knowledge Problem, The Lucas Critique, Tyler Cowen

 

Artificial intelligence (AI) or machine learning (ML) will never make central economic planning a successful reality. Jesús Fernández-Villaverde of the University of Pennsylvania has written a strong disavowal of AI’s promise in central planning, and on the general difficulty of using ML to design social and economic policies. His paper, “Simple Rules for a Complex World with Artificial Intelligence“, was linked last week by Tyler Cowen at Marginal Revolution. Note that the author isn’t saying “digital socialism” won’t be attempted. Judging by the attention it’s getting, and given the widespread acceptance of the scientism of central planning, there is no question that future efforts to collectivize will involve “data science” to one degree or another. But Fernández-Villaverde, who is otherwise an expert and proponent of ML in certain applications, is simply saying it won’t work as a curative for the failings of central economic planning — that the “simple rules” of the market will always produce superior social outcomes.

The connection between central planning and socialism should be obvious. Central planning implies control over the use of resources, and therefore ownership by a central authority, whether or not certain rents are paid as a buy-off to the erstwhile owners of those resources. By “digital socialism”, Fernández-Villaverde means the use of ML to perform the complex tasks of central planning. The hope among its cheerleaders is that adaptive algorithms can discern the optimal allocation of resources within some “big data” representation of resource availability and demands, and that this is possible on an ongoing, dynamic basis.

Fernández-Villaverde makes the case against this fantasy on three fronts or barriers to the use of AI in policy applications: data requirements; the endogeneity of expectations and behavior; and the knowledge problem.

The Data Problem: ML requires large data sets to do anything. And impossibly large data sets are required for ML to perform the task of planning economic activity, even for a small portion of the economy. Today, those data sets do not exist except in certain lines of business. Can they exist more generally, capturing the details of all economic transactions? Can the data remain current? Only at great expense, and ML must be trained to recognize whether data should be discarded as it becomes stale over time due to shifting demographics, tastes, technologies, and other changes in the social and physical environment. 

Policy Change Often Makes the Past Irrelevant: Planning algorithms are subject to the so-called Lucas Critique, a well known principle in macroeconomics named after Nobel Prize winner Robert Lucas. The idea is that policy decisions based on observed behavior will change expectations, prompting responses that differ from the earlier observations under the former policy regime. A classic case involves the historical tradeoff between inflation and unemployment. Can this tradeoff be exploited by policy? That is, can unemployment be reduced by a policy that increases the rate of inflation (by printing money at a faster rate)? In this case, the Lucas Critique is that once agents expect a higher rate of inflation, they are unlikely to confuse higher prices with a more profitable business environment, so higher employment will not be sustained. If ML is used to “plan” certain outcomes desired by some authority, based on past relationships and transactions, the Lucas Critique implies that things are unlikely to go as planned.  

The Knowledge Problem: Not only are impossibly large data sets required for economic planning with ML, as noted above. To achieve the success of markets in satisfying unlimited wants given scarce resources, the required information is impossible to collect or even to know. This is what Friedrich Hayek called the “knowledge problem”. Just imagine the difficulty of arranging a data feed on the shifting preferences of many different individuals across a huge number of products and services, and the way preference orderings will change across the range of possible prices. The data must have immediacy, not simply a historical record. Add to this the required information on shifting supplies and opportunity costs of resources needed to produce those things. And the detailed technological relationships between production inputs and outputs, including time requirements, and the dynamics of investment in future productive capacity. And don’t forget to consider the variety of risks agents face, their degree of risk aversion, and the ways in which risks can be mitigated or hedged. Many of these things are simply unknowable to a central authority. The information is hopelessly dispersed. The task of collecting even the knowable pieces is massive beyond comprehension.

The market system, however, is able to process all of this information in real time, the knowable and the unknowable, in ways that balance preferences with the true scarcity of resources. No one actor or authority need know it all. It is the invisible hand. Among many other things, it ensures the deployment of ML only where it makes economic sense. Here is Fernández-Villaverde:

“The only reliable method we have found to aggregate those preferences, abilities, and efforts is the market because it aligns, through the price system, incentives with information revelation. The method is not perfect, and the outcomes that come from it are often unsatisfactory. Nevertheless, like democracy, all the other alternatives, including ‘digital socialism,’ are worse.”

Later, he says:

“… markets work when we implement simple rules, such as first possession, voluntary exchange, and pacta sunt servanda. This result is not a surprise. We did not come up with these simple rules thanks to an enlightened legislator (or nowadays, a blue-ribbon committee of academics ‘with a plan’). … The simple rules were the product of an evolutionary process. Roman law, the Common law, and Lex mercatoria were bodies of norms that appeared over centuries thanks to the decisions of thousands and thousands of agents.” 

These simple rules represent good private governance. Beyond reputational enforcement, the rules require only trust in the system of property rights and a private or public judicial authority. Successfully replacing private arrangements in favor of a central plan, however intricately calculated via ML, will remain a pipe dream. At best, it would suspend many economic relationships in amber, foregoing the rational adjustments private agents would make as conditions change. And ultimately, the relationships and activities that planning would sanction would be shaped by political whim. It’s a monstrous thing to contemplate — both fruitless and authoritarian.

You’re Welcome: Charitable Gifts Prompt Statist Ire

14 Friday Dec 2018

Posted by Nuetzel in Central Planning, Charity, Uncategorized

≈ 1 Comment

Tags

Amazon, American Institute for Economic Research, central planning, Charity, Chloe Anagnos, Day 1 Fund, Doug Bandow, Forced Charity, Gaby Del Valle, Homelessness, Jeff Bezos, Redistribution, Russ Roberts, Scientism, Seattle Employment Tax, War on Charity

Charitable acts are sometimes motivated by a desire to cultivate a favorable reputation, or even to project intelligence. Perhaps certain charitable acts are motivated by guilt of one kind or another. Tax deductions are nice, too. But sometimes a charitable gift is prompted by no more than a desire to help others less fortunate. It’s likely a combination of motives in many cases, but to gainsay the purity of anyone’s charitable motives is rather unseemly. Yet Gaby Del Valle does just that in Vox, casting a skeptical eye at Jeff Bezos’ efforts to help the homeless through his Day 1 Fund.

“Last week, Amazon founder and CEO Jeff Bezos announced that he and his wife, MacKenzie Bezos, were donating $97.5 million to 24 organizations that provide homeless services across the country. The donation is part of Bezos’s $2 billion ‘Day 1 Fund,’ a philanthropic endeavor … that, according to Bezos, focuses on establishing ‘a network of new, non-profit, tier-one preschools in low-income communities’ and funding existing nonprofits that provide homeless services.”

Del Valle says Bezos deserves little credit for his big gift for several reasons. First, Amazon very publicly opposed a recent initiative for a $275 per employee tax on large employers in Seattle. The proceeds would have been used to fund public programs for the homeless. This allegation suggests that Bezos feels guilty, or that the gift is a cynical attempt to buy off critics. That might have an element of truth, but the tax was well worthy of opposition on economic grounds — almost as if it was designed to stunt employment and economic growth in the city.

Second, because Amazon has been an engine of growth for Seattle, Del Valle intimates that the company and other large employers are responsible for the city’s high cost of housing and therefore homelessness. Of course, growth in a region’s economy is likely to lead to higher housing prices if the supply of housing does not keep pace, but forsaking economic growth is not a solution. Furthermore, every large city in the country suffers from some degree of homelessness. And not all of those homeless individuals have been “displaced”, as Del Valle would have it. Some have relocated voluntarily without any guarantee or even desire for employment. As for the housing stock, government environmental regulations, zoning policies and rent control (in some markets) restrain expansion, leading to higher costs.

Finally, Del Valle implies that private efforts to help the homeless are somehow inferior to “leadership by elected officials”. Further, she seems to regard these charitable acts as threatening to “public” objectives and government control. At least she doesn’t disguise her authoritarian impulses. Del Valle also quotes a vague allegation that one of the charities beholden to Amazon is less than a paragon of charitable virtue. Well, I have heard similar allegations that government isn’t celebrated for rectitude in fulfilling its duties. Like all statists, Del Valle imagines that government technocrats possess the best vision of how to design aid programs. That attitude is an extension of the scientism and delusions of efficacy typical of central planners. Anyone with the slightest awareness of the government’s poor track record in low-income housing would approach such a question with trepidation. In contrast, private efforts often serve as laboratories in which to test innovative programs that can later be adopted on a broader scale.

While selfishness might motivate private acts of charity in some cases, only voluntary, private charity can ever qualify as real charity. Government benefits for the homeless are funded by taxes, which are compulsory. Such public programs might be justifiable as an extension of social insurance, but they are not charity in any pure sense; neither are their advocates engaged in promoting real charity, despite their conveniently moralistic positioning. And unlike private charity, government redistribution programs can be restrained only through a political process in which substantial payers are a distinct minority of the voting population.

Public aid and private charity have worked alongside each other for many years in the U.S. According to Russ Roberts, private giving to the poor began to be “crowded-out” during the Great Depression by a dramatic increase in public assistance programs. (Also see Doug Bandow’s “War On Charity“.) It’s certainly more difficult to make a case for gifts to the poor when donors are taxed by the government in order to redistribute income.

The statist war on private charity can take other forms. The regulatory apparatus can crowd-out private efforts to extend a helping hand. Chloe Anagnos of the American Institute for Economic Research (AIER) writes of a charity in Kansas City that wanted to provide home-cooked soup to the homeless, but health officials intervened, pouring bleach into the soup. I am aware of similar but less drastic actions in St. Louis, where organizations attempting to hand-out sandwiches to the poor were recently prohibited by health authorities.

Private charity has drawn criticism because its source has driven economic growth, because that source has opposed policies that stunt economic growth, and because it might interfere with the remote possibility that government would do it better. But private charity plays a critical role in meeting the needs of the disadvantaged, whether as a substitute for public aid where it falls short, or as a supplement. It can also play a productive role in identifying the most effective designs for aid programs. Of course, there are corrupt organizations and individuals purporting to do charitable work, which argues for a degree of public supervision over private charities. But unfortunately, common sense is too often lost to overzealous enforcement. In general, the public sector should not stand in the way of private charities and charitable acts, but real generosity has little value to those who press for domination by the state.

Authoritarian Designs

31 Sunday Jan 2016

Posted by Nuetzel in Progressivism, racism, Uncategorized

≈ Leave a comment

Tags

Bernie Sanders, Child Quotas, CRISPR, Davis Bacon Act, Eugenics, Friedrich Hayek, John Stuart Mill, Jonah Goldberg, Kevin Drum, Minimum Wage, Mother Jones, Obamacare Effectiveness Research, Progressivism, racism, Scientism, Sterilization, Tyler Cowen

eugenics certificate

Why condemn today’s progressives for their movement’s early endorsement of eugenics? Kevin Drum at Mother Jones thinks this old association is now irrelevant. He furthermore believes that eugenics is not an important issue in the modern world. Drum’s remarks were prompted by Jonah Goldberg’s review of Illiberal Reformers, a book by Thomas Leonard on racism and eugenicism in the American economics profession in the late 19th century. Tyler Cowen begs to differ with Drum on both counts, but for reasons that might not have been obvious to Drum. Eugenics is not a bygone, and its association with progressivism is a reflection of the movement’s broader philosophy of individual subservience to the state and, I might add, the scientism that continues to run rampant among progressives.

Cowen cites John Stuart Mill, one of the great social thinkers of the 19th century, who was an advocate for individual liberty and a harsh critic of eugenics. Here is a great paragraph from Cowen:

“The claim is not that current Progressives are evil or racist, but rather they still don’t have nearly enough Mill in their thought, and not nearly enough emphasis on individual liberty. Their continuing choice of label seems to indicate they are not much bothered by that, or maybe not even fully aware of that. They probably admire Mill’s more practical reform progressivism quite strongly, or would if they gave it more thought, but they don’t seem to relate to the broader philosophy of individual liberty as it surfaced in the philosophy of Mill and others. That’s a big, big drawback and the longer history of Progressivism and eugenics is perhaps the simplest and most vivid way to illuminate the point. This is one reason why the commitment of the current Left to free speech just isn’t very strong.”

Eugenics is not confined to the distant past, as Cowen notes, citing more recent “progressive” sterilization programs in Sweden and Canada, as well as the potential use of DNA technologies like CRISPR in “designing” offspring. That’s eugenics. So is the child quota system practiced in China, sex-selective abortion, and the easy acceptance of aborting fetuses with congenital disorders. Arguably, Obamacare “effectiveness research” guidelines cut close to eugenicism by proscribing certain treatments for individuals based upon insufficient “average benefit”, which depends upon age, disability, and stage of illness. Obamacare authorizes guidelines that may ultimately depend on gender, race, and ethnicity. All of these examples illustrate the potential for eugenics to be practiced on a broader scale and in ways that could trample individual rights.

Jonah Goldberg also responded to Drum in “On Eugenics and White Privilege“. (You have to scroll way down at the link to find the section with that title.) Goldberg’s most interesting points relate to the racism inherent in the minimum wage and the Davis-Bacon Act, two sacred cows of progressivism with the same original intent as eugenics: to weed out “undesirables”, either from the population or from competing in labor markets. It speaks volumes that today’s progressives deny the ugly economic effects of these policies on low-skilled workers, yet their forebears were counting on those effects.

Scientism is a term invoked by Friedrich Hayek to describe the progressive fallacy that science and planning can be used by the state to optimize the course of human affairs. However, the state can never command all the information necessary to do so, particularly in light of the dynamism of information relating to scarcity and preferences; government has trouble enough carrying out plans that merely match the static preferences of certain authorities. Historically, such attempts at planning have created multiple layers of tragedy, as individual freedoms and material well-being were eroded. Someone should tell Bernie Sanders!

Eugenics fit nicely into the early progressive view, flattering its theorists with the notion that the human race could be made… well, more like them! Fortunately, eugenics earned its deservedly bad name, but it continues to exist in somewhat more subtle forms today, and it could take more horrific forms in the future.

Two earlier posts on Sacred Cow Chips dealt at least in part with eugenics: “Child Quotas: Family as a Grant of Privilege“, and “Would Heterosexuals Select For Gay Genes?“.

 


Horizons Lost To Coercive Intervention

27 Wednesday Jan 2016

Posted by Nuetzel in Human Welfare, Price Controls, Regulation

≈ Leave a comment

Tags

Allocation of Resources, Don Boudreaux, Foregone Alternatives, Frederic Bastiat, Luddites, Minimum Wage, Opportunity Costs, Price Ceilings, Price Controls, Price floors, Rent Control, Scientism, Unintended Consequences, What is Not Seen

Every action has a cost. When you’re on the hook, major decisions are obviously worth pondering. But major societal decisions are often made by agents who are not on the hook, with little if any accountability for long-term consequences. They have every incentive to discount potential downside effects, especially in the distant future. Following Frederic Bastiat, Don Boudreaux writes of three levels of “What Is Not Seen” as a consequence of human decisions, which I summarize here:

  1. Immediate foregone alternatives: Possession, use and enjoyment of X is not seen if you buy Y.
  2. Resources not directed to foregone alternatives: The reduction in X inventory is not seen, compensating production of X is not seen, and extra worker hours, capital use and flow of raw materials needed for X production are not seen.
  3. The future implied by foregone alternatives: Future impacts can take many forms. X might have been a safer or healthier alternative, but those benefits are unseen. X might have been lower quality, so the potential frustration and repairs are unseen. X might have been less expensive, but the future benefits of the money saved are unseen. All of these “unseens” have implications for the future world experienced by the decision-maker and others.

These effects take on much more significance in multiples, but (2) and (3) constitute extended unseen implications for society at large. In multiples, the lost (unseen) X production and X labor-hours, capital and raw materials are more obvious to the losers in the X industry than to the winners in the Y industry, but they matter. In the future, the vibrant X industry that might have been will not be seen; the resources diverted to meet Y demand won’t be seen at new or even old X factories. X might well vanish, leaving only nontransformable detritus as a token of its existence.

Changes in private preferences or in production technologies create waves in the course of the “seen” reality and the “unseen” world foregone. Those differences are caused by voluntary, private choice, so gains are expected to outweigh losses relative to the “road not traveled”. That’s not a given, however, when decisions are imposed by external authorities with incentives unaligned with those in their thrall. For that reason, awareness of the unseen is of great importance in policy analysis, which is really Boudreaux’s point. Here is an extreme example he offers in addressing the far-reaching implications of government intrusions:

“Suppose that Uncle Sam in the early 20th century had, with a hypothetical Ludd Act, effectively prohibited the electrification of American farms, businesses, and homes. That such a policy would have had a large not-seen element is evident even to fans of Bernie Sanders. But the details of this not-seen element would have been impossible today even to guess at with any reliability. Attempting to quantify it econometrically would be an exercise in utter futility. No one in a 2015 America that had never been electrified could guess with any sense what the Ludd Act had cost Americans (and non-Americans as well). The not-seen would, in such a case, loom so large and be so disconnected to any known reality that it would be completely mysterious.“

Price regulation provides more familiar examples. Rent controls intended to “protect” the public from landlords have enormous “unintended” consequences. Like any price regulation, rent controls stifle exchange, reducing the supply and quality of housing. Renters are given an incentive to remain in their units, and property owners have little incentive to maintain or upgrade their properties. Deterioration is inevitable, and ultimately so is the displacement of renters. The unseen, lost world would have included more housing, better housing, more stable neighborhoods and probably less crime.

A price floor covered by Boudreaux is the minimum wage. The fully predictable but unintended consequences include immediate losses in some combination of jobs, hours, benefits, and working conditions by the least-skilled class of workers. Higher-paid workers feel the impact too, as they are asked to perform more (and less complex) tasks or are victimized by more widespread substitution of capital for labor. Consumers also feel some of the pain in higher prices. The net effect is a reduction in mutually beneficial trade that continues and may compound with time:

“As the time span over which obstructions to certain economic exchanges lengthens, the exchanges that would have, but didn’t, take place accumulate. The businesses that would have been created absent a minimum wage – but which, because of the minimum wage, are never created – grow in number and variety. The instances of on-the-job worker training that would have occurred – but, because of the minimum wage, didn’t occur – stack up increasingly over time.“

Regulation and taxation of all forms have such destructive consequences, but policy makers seldom place a heavy weight on the unobserved counterfactual. Boudreaux emphasizes the futility of quantifying the “unseen” effects of these policies:

“… those who insist that only that which can be measured and quantified with numerical data is real must deny, as a matter of their crabbed and blinding scientism, that such long-term effects … are not only not-seen but also, because they are not-seen, not real.“

The trade and welfare losses of coercive interventions of all types are not hypothetical. They are as real as the losses caused by destruction of property by vandals. Never again can the owners enjoy the property as they once had. Future pleasures are lost and cannot be observed or measured objectively. Even worse, when government disrupts economic activity, the cumulative losses condemn the public to a backward world that they will find difficult to recognize as such.

 

A Cooked-Up Climate Consensus

14 Tuesday Jul 2015

Posted by Nuetzel in Global Warming

≈ 3 Comments

Tags

97% Consensus, AGW, Anthropogenic Global Warming, Climate Change, Climate change consensus, Climate fraud, Ian Plimer, John Cook, Matt Ridley, Peer Review Process, Richard Tol, Scientism, University of Queensland

Consensus: the world is flat; the science is settled. Consensus: the earth is at the center of the universe; the science is settled. Consensus: bloodletting can cure diseases; the science is settled. Did these ideas truly represent scientific consensus? Contemporaries probably thought so, but it’s more likely that these notions derived from long- and widely-held assumptions that had never been tested adequately via scientific methods. It might have been difficult, if not impossible, to test those propositions using the methods available at the time. There are certainly other examples of “settled science” that were later revised, such as certain aspects of Newtonian physics.

The so-called “consensus” on climate change is similar to the first few “scientistic” assertions above, except that it’s a much less honest mistake. The most prominent claim about it is that 97% of climate scientists agree that humans have contributed to global warming. That is incorrect in several ways. Its genesis is a 2013 paper by John Cook of the University of Queensland. Richard Tol of the University of Sussex examines the facts surrounding the Cook paper in “Global warming consensus claim does not stand up“. The claim itself is a misrepresentation of Cook’s findings, according to Tol:

“The 97% refers to the number of papers, rather than the number of scientists. The alleged consensus is about any human role in climate change, rather than a dominant role….“

It is well known that the peer review process in the climate research community was fundamentally corrupt during the period covered by Cook’s examination of the literature. Papers submitted to academic journals by climate “dissenters” were often shut out, which would have biased Cook’s findings even if his review had been conducted honestly. Tol goes on to note the distortions introduced by Cook’s research, including a non-representative sample of papers:

“The sample was padded with irrelevant papers. An article about TV coverage on global warming was taken as evidence for global warming. In fact, about three-quarters of the papers counted as endorsements had nothing to say about the subject matter.“

It gets even worse:

“Cook enlisted a small group of environmental activists to rate the claims made by the selected papers. Cook claims that the ratings were done independently, but the raters freely discussed their work. There are systematic differences between the raters. Reading the same abstracts, the raters reached remarkably different conclusions – and some raters all too often erred in the same direction. Cook’s hand-picked raters disagreed what a paper was about 33% of the time. In 63% of cases, they disagreed about the message of a paper with the authors of that paper.“

On top of all that, Cook was uncooperative when asked to make his data available to other researchers. Apparently a hacker obtained the data, which revealed a highly questionable data collection process (and that Cook had lied regarding the existence of time stamps on the surveys):

“After collecting data for 8 weeks, there were 4 weeks of data analysis, followed by 3 more weeks of data collection. The same people collected and analysed the data. After more analysis, the paper classification scheme was changed and yet more data collected.“

In short, the Cook research upon which the 97% claim is based is trash. There are a number of points upon which climate researchers can largely agree in principle, including the fact that greenhouse gases would warm the planet, but only if ceteris paribus is invoked. There are many feedback effects and confounding influences that change the relationship, and the actual time span of data that can be brought to bear on the issue is far too short to justify bold conclusions. Unfortunately, the research environment is so politicized that even the data itself is subject to manipulation. Astonishingly, many assertions about the actual climate are, in fact, based on model output, not actual data!

There is strong disagreement at the highest levels of the scientific community regarding the balance of the evidence on climate change and whether it justifies radical policy change. Matt Ridley examines this issue in “The Climate Wars’ Damage To Science“:

“Today’s climate science, as Ian Plimer points out in his chapter in The Facts, is based on a ‘pre-ordained conclusion, huge bodies of evidence are ignored and analytical procedures are treated as evidence’. Funds are not available to investigate alternative theories. Those who express even the mildest doubts about dangerous climate change are ostracised, accused of being in the pay of fossil-fuel interests or starved of funds; those who take money from green pressure groups and make wildly exaggerated statements are showered with rewards and treated by the media as neutral.“

Ridley goes on to recount the litany of scandals that have erupted within the climate establishment over the past few years. It is well worth reading, but ultimately these developments can’t help but damage science, its reputation with the public, and its usefulness to mankind.
