
Sacred Cow Chips

Tag Archives: Matt Ridley

Conformity and Suppression: How Science Is Not “Done”

26 Thursday Jan 2023

Posted by Nuetzel in Political Bias, Science

Tags

Breakthrough Findings, Citation Politics, Citation Practices, Climate science, Conformist Science, Covid Lockdowns, Disruptive Science, Mary Wortley Montagu, Matt Ridley, NASA, Nature Magazine, Politicized Science, President Dwight Eisenhower, Public Health, Scientism, Scott Sumner, Steven F. Hayward, Wokeness

I’m not terribly surprised to learn that scientific advancement has slowed over my lifetime. A recent study published in the journal Nature documented a secular decline in the frequency of “disruptive” or “breakthrough” scientific research across a range of fields. Research has become increasingly dominated by “incremental” findings, according to the authors. The graphic below tells a pretty dramatic story:

The index values used in the chart range “from 1 for the most disruptive to -1 for the least disruptive.” The methodology used to assign these values, which summarize academic papers as well as patents, produces a few oddities. Why, for example, does the tech revolution of the last 40 years create barely a blip in the technology index in the chart above? And why have tech research and social science research always been more “disruptive” than other fields of study?
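For concreteness, the metric behind these values is the “CD index” of Funk and Owen-Smith, which the Nature authors employ. It classifies each later paper citing a focal work by whether that paper also cites the focal work’s own references. Here is a minimal sketch of the idea (the function, variable names, and toy data below are my own illustration, not code from the study):

```python
def cd_index(focal_refs, citing_papers):
    """Rough sketch of the consolidation-disruption (CD) index.

    focal_refs: set of works the focal paper cites.
    citing_papers: list of sets; each set holds the works a later
        paper cites, with the focal paper represented by "FOCAL".
    """
    n_i = n_j = n_k = 0
    for cites in citing_papers:
        cites_focal = "FOCAL" in cites
        cites_refs = bool(cites & focal_refs)
        if cites_focal and not cites_refs:
            n_i += 1   # builds on the focal work alone: disruptive
        elif cites_focal and cites_refs:
            n_j += 1   # cites the focal work and its antecedents: consolidating
        elif cites_refs:
            n_k += 1   # bypasses the focal work, cites only its antecedents
    total = n_i + n_j + n_k
    return (n_i - n_j) / total if total else 0.0

# Toy example: three later papers cite only the focal work, one cites
# both the focal work and its references, one cites only the references.
refs = {"A", "B"}
later = [{"FOCAL"}, {"FOCAL"}, {"FOCAL"}, {"FOCAL", "A"}, {"B"}]
print(cd_index(refs, later))  # → 0.4
```

A paper whose citers ignore its antecedents scores near 1 (disruptive); a paper always cited alongside its antecedents scores near -1 (consolidating), which is what makes a drift toward incrementalism visible in the index.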

Putting those questions aside, the Nature paper finds trends that are basically consistent across all fields. Apparently, systematic forces have led to declines in these measures of breakthrough scientific findings. The authors offer a few candidate explanations for the forces at play: fewer researchers, incrementalism, and the growing role of large-team research, which induces conformity. But if research has become more incremental, that is more accurately described as a symptom of the disease than a cause.

Conformity

Steven F. Hayward skewers the authors a bit, perhaps unfairly, while voicing a concern held by many skeptics of current scientific practice. Hayward says the paper:

“… avoids the most significant and obvious explanation with the myopia of Inspector Clouseau, which is the deadly confluence of ideology and the increasingly narrow conformism of academic specialties.”

Conformism in science is nothing new, and it has often interfered with the advancement of knowledge. The earliest cases of suppression of controversial science were motivated by religious doctrine, but challenges to almost any scientific “consensus” seem to be looked upon as heresy. Several early cases of suppression are discussed here. Matt Ridley has described the case of Mary Wortley Montagu, who visited Ottoman Turkey in the early 1700s and witnessed the application of pus from smallpox blisters to small scratches on the skin of healthy subjects. The mild illness this induced led to immunity, but the British medical establishment ridiculed her. A similar fate was suffered by a Boston physician in 1721. Ridley says:

“Conformity is the enemy of scientific progress, which depends on disagreement and challenge. Science is the belief in the ignorance of experts, as [the physicist Richard] Feynman put it.”

When Was the Scientific Boom?

I couldn’t agree more with Hayward and Ridley on the damaging effects of conformity. But what gave rise to our recent slide into scientific conformity, and when did it begin? The Nature study on disruptive science used data on papers and patents starting in 1945. The peak year for disruptive science within the data set was … 1945, but the index values were relatively high over the first two decades of the data set. Maybe those decades were very special for science, with a variety of applications and high-profile accomplishments that have gone unmatched since. As Scott Sumner says in an otherwise unrelated post, in many ways we’ve failed to live up to our own expectations:

“In retrospect, the 1950s seem like a pivotal decade. The Boeing 707, nuclear power plants, satellites orbiting Earth, glass walled skyscrapers, etc., all seemed radically different from the world of the 1890s. In contrast, airliners of the 2020s look roughly like the 707, we seem even less able to build nuclear power plants than in the 1960s, we seem to have a harder time getting back to the moon than going the first time, and we still build boring glass walled skyscrapers.”

It’s difficult to put the initial levels of the “disruptiveness” indices into historical context. We don’t know whether science was even more disruptive prior to 1945, or how the indices used by the authors of the Nature article would have captured it. And it’s impossible to say whether there is some “normal” level of disruptive research. Is a “normal” index value equal to zero, which we now approach as an asymptote?

Some incredible scientific breakthroughs occurred decades before 1945, to take Einstein’s theory of relativity as an obvious example. Perhaps the index value for physical sciences would have been much higher at that time, were it measured. Whether the immediate post-World War II era represented an all-time high in scientific disruption is anyone’s guess. Presumably, the world is always coming from a more primitive base of knowledge. Discoveries, however, usually lead to new and deeper questions. The authors of the Nature article acknowledge and attempt to test for the “burden” of a growing knowledge base on the productivity of subsequent research and find no effect. Nevertheless, it’s possible that the declining pattern after 1945 represents a natural decay following major “paradigm shifts” in the early twentieth century.

The Psychosis Now Known As “Wokeness”

The Nature study used papers and patents only through 2010. Therefore, the decline in disruptive science predates the revolution in “wokeness” we’ve seen over the past decade. But “wokeness” amounts to a radicalization of various doctrines that have been knocking around for years. The rise of social justice activism, critical theory, and anthropogenic global warming theology all began long before the turn of the century and had far-reaching effects that extended to the sciences. The recency of “wokeness” certainly doesn’t invalidate Hayward and Ridley when they note that ideology has a negative impact on research productivity. It’s likely, however, that some fields of study are relatively immune to the effects of politicization, such as the physical sciences. Surely other fields are more vulnerable, like the social sciences.

Citations: Not What They Used To Be?

There are other possible causes of the decline in disruptive science as measured by the Nature study, though the authors believe they’ve tested and found these explanations lacking. It’s possible that an increase in collaborative work led to a change in citation practices. For example, this study found that while self-citation has remained stable, citation of those within an author’s “collaboration network” has declined over time. Another paper identified a trend toward citing review articles in ecology journals rather than the underlying research upon which those reviews were based, resulting in the incorrect attribution of ideas and findings. That would directly reduce the measured “disruptiveness” of a given paper, but it’s not clear whether the trend extends to other fields.

Believe it or not, “citation politics” is a thing! It reflects the extent to which a researcher is expected to suck up to prominent authors in a field of study, or to anyone else who might be deemed potentially helpful or harmful. In a development that speaks volumes about trends in research productivity, authors are now urged to append a “Citation Diversity Statement” to their papers. Here’s an academic piece addressing the subject of “gendered citation practices” in contemporary physics. The 11 authors of that paper would do well to spend more time thinking about problems in physics than obsessing over whether their world is “unfair”.

Science and the State

None of those alternative explanations negates my strong sense that science has been politicized, and that politicization is harming our progress toward a better world. In fact, it usually leads us astray. Perhaps the most egregious example of politicized conformism today is climate science, though the health sciences veered headlong into a distinctly unhealthy conformism during the pandemic (and see this for a dark laugh).

Politicized science leads to both conformism and suppression. Here are several channels through which politicization might create these perverse tendencies and reduce research productivity or disruptiveness:

  • Political or agenda-driven research is driven by subjective criteria, rather than objective inquiry and even-handed empiricism
  • Research funding via private or public grants is often contingent upon whether the research can be expected to support the objectives of the funding NGOs, agencies, or regulators. The gravy train is reserved for those who support the “correct” scientific narrative
  • Promotion or tenure decisions may be sensitive to the political implications of research
  • Government agencies have been known to block access to databases funded by taxpayers when a scientist wishes to investigate the “wrong questions”
  • Journals and referees have political biases that may influence the acceptance of research submissions, which in turn influences the research itself
  • The favorability of coverage by a politicized media influences researchers, who are sensitive to the damage the media can do to one’s reputation
  • The influence of government agencies on media treatment of scientific discussion has proven to be a potent force
  • The chance that one’s research might have a public policy impact is heavily influenced by politics
  • The talent sought and/or attracted to various fields may be diminished by the primacy of political considerations. Indoctrinated young activists generally aren’t the material from which objective scientists are made

Conclusion

In fairness, there is a great deal of wonderful science being conducted these days, despite the claims appearing in the Nature piece and the politicized corruption undermining good science in certain fields. Tremendous breakthroughs are taking place in areas of medical research such as cancer immunotherapy and diabetes treatment. Fusion energy is inching closer to a reality. Space research is moving forward at a tremendous pace in both the public and private spheres, despite NASA’s clumsiness.

I’m sure there are several causes for the 70-year decline in scientific “disruptiveness” measured in the article in Nature. Part of that decline might have been a natural consequence of coming off an early twentieth-century burst of scientific breakthroughs. There might be other clues related to changes in citation practices. However, politicization has become a huge burden on scientific progress over the past decade. The most awful consequences of this trend include a huge misallocation of resources from industrial planning predicated on politicized science, and a meaningful loss of lives owing to the blind acceptance of draconian health policies during the Covid pandemic. When guided by the state or politics, what passes for science is often no better than scientism. There are, however, even in climate science and public health disciplines, many great scientists who continue to test and challenge the orthodoxy. We need more of them!

I leave you with a few words from President Dwight Eisenhower’s Farewell Address in 1961, in which he foresaw issues related to the federal funding of scientific research:

“Akin to, and largely responsible for the sweeping changes in our industrial-military posture, has been the technological revolution during recent decades.

In this revolution, research has become central; it also becomes more formalized, complex, and costly. A steadily increasing share is conducted for, by, or at the direction of, the Federal government.

Today, the solitary inventor, tinkering in his shop, has been overshadowed by task forces of scientists in laboratories and testing fields. In the same fashion, the free university, historically the fountainhead of free ideas and scientific discovery, has experienced a revolution in the conduct of research. Partly because of the huge costs involved, a government contract becomes virtually a substitute for intellectual curiosity. For every old blackboard there are now hundreds of new electronic computers.

The prospect of domination of the nation’s scholars by Federal employment, project allocations, and the power of money is ever present and is gravely to be regarded.

Yet, in holding scientific research and discovery in respect, as we should, we must also be alert to the equal and opposite danger that public policy could itself become the captive of a scientific-technological elite.”

Renewables and Preempted Prosperity

10 Wednesday Jul 2019

Posted by Nuetzel in Central Planning, Renewable Energy

Tags

Carbon Sensitivity, David Middleton, Economic Cost of Carbon, Fossil fuels, Intermittency, John Barry, Los Angeles Eland Project, Martin Heidegger, Matt Ridley, Michael Shellenberger, Murray Bookchin, Renewable energy

Coerced conversion to renewable energy sources will degrade human living conditions. That’s certainly true relative to a voluntary conversion actuated by purely private incentives. It’s likely to be true even in an absolute sense, depending on the speed and severity of the forced transition. A coerced conversion will mean lower real incomes during the transition (one recent estimate: $42,000 total loss per U.S. household to transition by 2030), and the losses will continue after the transition, with little redeeming improvement in environmental conditions or risk.

The Reality

There are several underpinnings for the assertions above. One is that the sensitivity of global temperatures to carbon forcings is relatively low. We know all too well that the climate models relied upon by warming alarmists have drastically over-estimated the extent of warming to date. The models are excessively sensitive to carbon emissions and promote an unwarranted urgency to DO SOMETHING… with other people’s money. There is also the question of whether moderate warming is really a bad thing given that it is likely to mean fewer cold-weather fatalities, increased agricultural productivity, and significant reforestation.

Another underpinning is that the real economics of renewable energy are vastly inferior to fossil fuels and will remain so for some time to come. Proponents of renewables tend to quote efficiencies under optimal operating conditions, free of pesky details like the cost of installing a vast support infrastructure and environmental costs of producing components. Solar and wind energy are tremendously inefficient in terms of land use. One estimate is that meeting a 100% renewable energy target in the U.S. today would require acreage equivalent to the state of California. And of course rare earth minerals must be mined for wind turbines and solar panels, and fossil fuels are needed to produce materials like the steel used to build them.

But the chief renewable bugaboo is that the power generated by wind and solar is intermittent. Our ability to store power is still extremely limited, so almost all surplus energy production is lost. Therefore, intermittency necessitates redundant generating capacity, which imposes huge costs. When the winds are calm and the sun isn’t shining, traditional power sources are needed to meet demand. That redundant capacity must be maintained and kept on-line, as these facilities are even costlier to power up from a dead start.
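The arithmetic of redundancy is easy to illustrate. In the sketch below, every number is an assumption of mine chosen purely for illustration, not a figure from any study; the point is only that serving firm demand with an intermittent source means paying for oversized nameplate capacity plus a full complement of backup:

```python
# Illustrative only: all cost and capacity-factor figures are assumed.
solar_capacity_factor = 0.25   # average fraction of nameplate output delivered
solar_cost_per_kw = 1000       # $ per kW of nameplate solar capacity (assumed)
backup_cost_per_kw = 700       # $ per kW of dispatchable backup (assumed)

# To reliably serve 1 kW of demand, nameplate solar must be oversized
# by 1/capacity_factor, and 1 kW of backup must still be built and kept ready.
solar_kw_needed = 1 / solar_capacity_factor
capital_per_kw_served = (solar_kw_needed * solar_cost_per_kw
                         + 1 * backup_cost_per_kw)
print(capital_per_kw_served)  # → 4700.0
```

Under these assumed numbers, each kilowatt of firm demand requires $4,700 of capital rather than the $1,000 a naive nameplate comparison would suggest, and that is before counting storage losses or the cost of cycling the backup plants.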

LA Hucksterism

These issues are typified by the unrealistic expectations of Los Angeles’ plan to replace 7% of the city’s power consumption with renewables. The cost predicted by LA regulators is slightly less than 2 cents per kilowatt hour for solar and even less for battery power, which are unrealistically low. For one thing, those are probably operating costs that do not account for capital requirements. The plan promises to provide power 16 hours a day at best, but it’s not clear that the 7% estimate of the renewable share takes that into account or whether the real figure should be 4.2% of LA’s power needs. The project will require 2,600 acres for solar panels, and if it’s like other solar plant installations, the stated capacity is based on the few hours of the day when the sun’s rays are roughly perpendicular to the panels. So it’s likely that the real cost of the power will be many times the estimates, though taxpayers will subsidize 30% or more of the total. And then there is the negative impact on birds and other wildlife.

The Question of Intent

Michael Shellenberger goes so far as to say that a degraded standard of living is precisely what many fierce renewable advocates have long intended. Modern comforts are simply not compatible with 100% renewable energy any time soon, or perhaps ever given the investment involved, but a target of 100% was not really intended to be compatible with modern comforts. In fact, the renewable proposition was often intermingled with celebration of a more austere, agrarian lifestyle. Shellenberger discusses the case of Martin Heidegger, an early anti-technologist who said in 1954 that modern technology “puts to nature the unreasonable demand that it supply energy....” Of course, Heidegger was not talking about the use of solar panels. Others, like Murray Bookchin, were ultimately quite explicit about the “promise” of renewables to dial back industrial society in favor of an agrarian ideal. And here’s a quote from a new book by John Barry, Professor of “Green Political Economy” (!) at Queen’s University Belfast:

“The first question which serves as the starting point of this chapter is to ask if the objective of economic growth is now ecologically unsustainable, socially divisive and has in many countries passed the point when it is adding to human wellbeing?”

If that’s the question, the answer is no! The quote is courtesy of David Middleton. Green Professor Barry has one thing right, however: growing anything will be tough after crowding erstwhile farm and forest land with solar panels and wind turbines. But at least someone “green” is willing to admit some economic realities, something many alarmists and politicians are loath to do.

Welfare Loss

Involuntary actions always involve a welfare loss, as “subjects” must sacrifice the additional value they’d otherwise derive from their own choices. So it is that coerced adoption of renewables implies a starker outcome than zero economic growth. Objective measurement of all welfare costs is difficult, but we know that the adoption of renewables implies measurable up-front and ongoing economic losses. Matt Ridley notes that the impact of those losses falls hardest on the poor, whose energy needs absorb a large fraction of income. This, along with fundamental impracticality and high costs, accounts for the populist backlash against radical efforts to promote renewables in some European states. The politics of forced adoption of renewables is increasingly grim, but attempts to sell a centrally-planned energy sector based on renewables continue.

Ridley is rightly skeptical of carbon doomsday scenarios, but the pressure to curb carbon emissions will remain potent. He advocates a different form of intervention: essentially a carbon tax on producers with proceeds dedicated to new, competing sequestration or carbon capture technologies. Still coercive, the tax itself requires an estimate of the “economic cost of carbon”, which is of tremendously uncertain magnitude. The tax, of course, has the potential to do real harm to the economy. On the other hand, Ridley is correct in asserting that the effort to fund competing carbon-capture projects would leverage powerful market forces and perhaps hasten breakthroughs.

Mandated Misery

The attempt to force a complete conversion to renewable energy sources is meeting increasing political challenges as its cost is revealed more clearly by experience. Alarmists have long recognized the danger of economic damage, however. Thus, they try to convince us that economic growth and our current standards of living aren’t as good as we think they are, and they continue to exaggerate claims about the promise of renewable technologies. One day, some of these technologies will be sufficiently advanced that they will be economically viable without taxpayer subsidies. The conversion to renewables should be postponed until that day, when users can justify the switch in terms of costs and benefits, and do so voluntarily without interference by government planners.

The UN’s Mass Extinction Fiction

20 Monday May 2019

Posted by Nuetzel in Biodiversity, Central Planning, Environment

Tags

African Elephants, Beepocalypse, Biodiversity, Bird Eater Tarantulas, CO2 Emissions, Daniel Hannan, Extinction, Gregory Wrightstone, Global Greening, Habitat Loss, IPCC, IUCN Red List, Jimmy Carter, Matt Ridley, Non-Native Species, Paris Accord, Polar Bears

A big story early this month warned of mass extinctions and a collapse of the planet’s biodiversity. This was based on a report by the UN’s Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES). A high-level presentation of the data by IPBES was constructed in a way that is easily revealed as misleading (see below). But the first thing to ask about bombastic reports like this is whether the authors are self-interested. There is big money in promoting apocalyptic scenarios and public programs to avert them. Large government grants are at stake for like-minded scientists, and political power is at stake for biodiversity activists worldwide. Like many other scare stories reported as “news”, this one feeds into the statist political agenda of the environmental Left.

Exaggerated claims of species endangerment are not a new phenomenon. We’ve heard grossly erroneous forecasts of polar bear extinctions, frightening but false warnings of a “beepocalypse”, and faulty claims about declines in the population of African elephants. These are headline-grabbing and more thrilling to report than mourning the prospective loss of an obscure species of cave lichen. But a mass extinction is something else! Daniel Hannan reminds us of the following:

“In 1980, for example, the Jimmy Carter administration distributed to foreign governments a report claiming that, by the year 2000, 2 million species would be wiped out. In fact, by 2010, there had been 872 documented extinctions.” 

Of course, that figure does not account for the multitude of new species discovered. There are many. Recent examples just gruesome enough to garner attention are the three new species of bird eater tarantulas discovered in 2017.

In the more general mass-extinction context of the IPBES report, the blame for the extremely pessimistic outlook is placed squarely on human activity. The authors finger CO2 emissions as the primary culprit, which is at best a theory, and one at odds with the chief driver of extinctions during the industrial era: the introduction of non-native species into environments whose flora and fauna are unable to withstand new competitors. Matt Ridley elaborates:

“The introduction by people of predators, parasites and pests, especially to islands, has been and continues to be far and away the greatest cause of local and global extinction of native fauna.”

There is no question that the IPBES report on extinctions was intended to create alarm. As Gregory Wrightstone demonstrates, the lack of rigor and the misleading expositional techniques used in the report are a tell:

“… the data were lumped together by century rather than shorter time frames, which, as we shall see accentuates the supposed increase in extinctions. … The base data were derived from the International Union for Conservation of Nature and Natural Resources (IUCN) Red List, which catalogues every known species that has gone the way of the dodo and the carrier pigeon. Review of the full data set reveals a much different view of extinction and what has been happening recently.”

The more granular charts Wrightstone presents are indeed contrary to the narrative in the IPBES report. And Wrightstone also highlights the following in a postscript:

“In an incredibly ironic twist that poses a difficult conundrum for those who are intent on saving the planet from our carbon dioxide excesses, the new study reports that the number one cause of predicted extinctions is habitat loss. Yet their solution is to pave over vast stretches of land for industrial scale solar factories and to construct immense wind factories that will cover forests and grasslands, killing the endangered birds and other species they claim to want to save.”

The enduring extinction racket is one among other fronts in the war on capitalism. The IPBES report must use the term “transformative” a thousand times, as it recommends “steering away from the current limited paradigm of economic growth“. Matt Ridley highlights the faulty attribution of alleged declines in biodiversity to “western values and capitalism”:

“On the whole what really diminishes biodiversity is a large but poor population trying to live off the land. As countries get richer and join the market economy they generally reverse deforestation, slow species loss and reverse some species declines.”

And Ridley also says this:

“A favourite nostrum of many environmentalists is that you cannot have infinite growth with finite resources. But this is plain wrong, because economic growth comes from doing more with less. So if I invent a new car engine that gets twice as many miles per gallon, I’ve caused economic growth but we’ll use less fuel. Likewise if I increase the yield of a crop, I need less land and probably less fuel too.”

It’s no coincidence that future extinctions foretold by IPBES are predicted to have drastic impacts on less-developed countries. It thus appears that IPBES exists in a happy synergy with the UN’s Intergovernmental Panel on Climate Change (IPCC), as well as proponents of the Paris Accord and the entire climate lobby. An objective that helps them garner support around the globe is to redistribute existing wealth to less-developed countries in the name of environmental salvation. That would prove a poor substitute for the kinds of free-market policies that would truly enhance prospects for economic growth in those nations.

The threat of mass extinctions is greatly exaggerated by the UN, IPBES, climate change activists, and members of the media who can’t resist promoting a crisis. Any diminished biodiversity we might experience going forward won’t be solved by limiting economic growth, as the IPBES report claims. Instead, advances in productivity, particularly in agriculture, can allow expansion of native habitat, as recent experience with reforestation and global greening demonstrates. This principle is as applicable to under-developed countries as anywhere else.

The kinds of centrally planned limits on human activity contemplated by the IPBES report are likely to backfire by making us poorer. Those limits would impose costs by misallocating resources away from things that people value most highly. They would also force people to forego the adoption of innovative production techniques, leading to the substitution of other resources, such as inefficient land use. And those limits would deny basic freedoms, including the unfettered use of private property.

The Bad News Industrial Complex

20 Friday Apr 2018

Posted by Nuetzel in Big Government, Corruption, Risk Management

Tags

Beepocalypse, Cronyism, Matt Ridley, NASA, News Media, Oxfam, Precautionary Principle, rent-seeking behavior, Risk Aversion, Risk Mitigation, The Lancet

Matt Ridley had an interesting piece on his blog last month entitled “Bad News Is Sudden, Good News Is Gradual“. It’s about the timing of news, as stated, and about our bias toward bad news more generally. There is no question that bad news tends to be more dramatic than good news. But with steadily increasing lifespans, growing prosperity, and world poverty at an all-time low, surely good news must arrive at least as frequently as bad. But good news can be inconvenient to certain narratives. It is therefore often ignored, and some other purported disaster is found as a substitute:

“Poverty and hunger are the business Oxfam is in, but has it shouted the global poverty statistics from the rooftops? Hardly. It has switched its focus to inequality. When The Lancet published a study in 2010 showing global maternal mortality falling, advocates for women’s health tried to pressure it into delaying publication ‘fearing that good news would detract from the urgency of their cause’, The New York Times reported. The announcement by Nasa in 2016 that plant life is covering more and more of the planet as a result of carbon dioxide emissions was handled like radioactivity by most environmental reporters.“

Tales of bad outcomes can be alluring, especially if they haven’t happened yet. In fact, bad things might even happen gradually, but dark visions of a world beyond the horizon impart a spooky sense of immediacy, and indeed, urgency. Ridley notes the tendency of people to regard pessimists as “wise”, while optimists are viewed as Pollyannas. And he recognizes that risk aversion plays an important role in this psychology. That brings me to the point I found most interesting in Ridley’s piece: the many vested interests in disasters, and disasters foretold.

Risk management is big business in an affluent society. There is a lot to lose, and a squeamish populace is easily cowed by good scare stories. The risk management and disaster-prevention narrative can be wrapped around any number of unlikely or exaggerated threats, serving the interests of the administrative state and private rent-seekers. One particular tool that has been most useful to this alliance is the precautionary principle. It is invoked to discourage or regulate activities presumed to pose risks to the public or to the environment. But there are three dimensions to the application of the precautionary principle: it provides a rationale for public funding of research into the risk-du-jour, for funding projects designed to mitigate its consequences, and for subsidizing development of alternative technologies that might help avoid or reduce the severity of the risk, often at great expense. The exaggeration of risk serves to legitimize these high costs. Of course, the entire enterprise would be impossible without the machinery of the state, in all its venality. Where money flows, graft is sure to follow.

Well-publicized disaster scenarios are helpful to statists in other ways. Risk, its causes, and its consequences are not distributed evenly across regions and populations. A risk thought to be anthropogenic in origin implies that wealthier and more productive communities and nations must shoulder the bulk of the global costs of mitigation. Thus, the risk-management ethic requires redistribution. Furthermore, wealthier regions are better situated to insulate themselves locally against many risks. Impoverished areas, on the other hand, must be assisted. Finally, an incredible irony of our preoccupation with disaster scenarios is the simultaneous effort to subsidize those deemed most vulnerable even while executing other policies that harm them.

Media organizations and their newspeople obviously benefit greatly from the subtle sensationalism of creeping disaster. As Ridley noted, the gradualism of progress is no match for a scare story on the nightly news. There is real money at stake here, but the media is driven not only by economic incentives. In fact, the dominant leftist ideology in media organizations means that they are more than happy to spread alarm as part of a crusade for state solutions to presumed risks. There are even well-meaning users of social media who jump at the chance to signal their virtue by reposting memes and reports that are couched not merely in terms of risks, but as dire future realities.

Mitigating social risks is a legitimate function of government. Unfortunately, identifying and exaggerating risks, and suppressing contradictory evidence, is in the personal interest of politicians, bureaucrats, crony capitalists, and many members of the media. Everything seems to demand government intervention. Carbon concentration, global warming and sea level changes are glaring examples of exaggerated risks. As Ridley says,

“The supreme case of unfalsifiable pessimism is climate change. It has the advantage of decades of doom until the jury returns. People who think the science suggests it will not be as bad as all that, or that humanity is likely to mitigate or adapt to it in time, get less airtime and a lot more criticism than people who go beyond the science to exaggerate the potential risks. That lukewarmers have been proved right so far cuts no ice.”

Other examples include the “beepocalypse”, genetic modification, drug use, school shootings, and certain risks to national security. Ridley offers the consequences of Brexit as well. There, I’ve listed enough sacred cows to irritate just about everyone.

In many cases, the real crises have more to do with government activism than the original issue with which they were meant to reckon. Which brings me to a discomfiting vision of my own: having allowed the administrative state to metastasize across almost every social organ and every aspect of our lives, a huge risk to our future well-being is continuing erosion of personal and economic liberties and our ability to prosper as a society. Here’s Ridley’s close:

“Activists sometimes justify the focus on the worst-case scenario as a means of raising consciousness. But while the public may be susceptible to bad news they are not stupid, and boys who cry ‘wolf!’ are eventually ignored. As the journalist John Horgan recently argued in Scientific American: ‘These days, despair is a bigger problem than optimism.'”

Ridley’s Case For Free Market Capitalism

05 Saturday Aug 2017

Posted by Nuetzel in Free markets

≈ Leave a comment

Tags

Capitalism, Corporatism, crony capitalism, Invisible Hand, Liberalism, Markets, Matt Ridley, monopoly, Profit Motive

Matt Ridley delivered an excellent lecture in July addressing a generally unappreciated distinction: markets and free enterprise vs. corporatism. Many don’t seem to know the difference. Ridley offers an insightful discussion of the very radical and liberating nature of free markets. The success of the free market system in alleviating poverty and increasing human well-being is glaringly obvious in historical perspective, but it’s become too easy for people to take market processes for granted. It’s also too easy to misinterpret outcomes in a complex society in which producers must navigate markets as well as a plethora of regulatory obstacles and incentives distorted by government.

I agree with almost everything Ridley has to say in this speech, but I think he does the language of economics no favors. I do not like his title: “The Case For Free Market Anti-Capitalism”. Free Markets are great, of course, and they are fundamental to the successful workings of a capitalistic system. Not a corporatist system, but capitalism! Ridley seems to think the latter is a dirty word. As if to anticipate objections like mine, Ridley says:

“‘Capitalism’ and ‘markets’ mean the same thing to most people. And that is very misleading. Commerce, enterprise and markets are – to me – the very opposite of corporatism and even of ‘capitalism’, if by that word you mean capital-intensive organisations with monopolistic ambitions.“

No, that is not what I mean by capitalism. Commerce, free enterprise, markets, capitalism and true liberalism all imply that you are free to make your own production and consumption decisions without interference by the state. Karl Marx coined the word “capitalism” as a derogation, but the word was co-opted long ago to describe a legitimate and highly successful form of social organization. I prefer to go on using “capitalism” as synonymous with free markets and liberalism, though the left is unlikely to abandon the oafish habit of equating liberalism with state domination.

Capital is man-made wealth, like machines and buildings. It can be used more intensively or less in production and commerce. But capitalism is underpinned by the concept of private property. You might own capital as a means of production, or you might operate an enterprise with very little capital, but the rewards of doing so belong to you. Saving those rewards by reinvesting in your business or investing in other assets allows you to accumulate capital. That’s a good way to build or expand a business that is successful in meeting the needs of its customers, and it’s a good way to provide for oneself later in life.

Capitalism does not imply monopolistic ambitions unless you incorrectly equate market success with monopoly power. Market success might mean that you are an innovator or just better at what you do than many of your competitors. It usually means that your customers are pleased. The effort to innovate or do your job well speaks to an ambition rooted in discovery, service and pride. In contrast, the businessperson with monopolistic ambitions is willing to achieve those ends by subverting normal market forces, including attempts to enlist the government in protecting their position. That’s known as corporatism, rent-seeking, and crony capitalism. It is not real capitalism, and Ridley should not confuse these terms. But he also says this:

“Free-market ideas are often the very opposite of business and corporate interests. “

Most fundamental to business interests is to earn a profit, and the profit motive is an essential feature of markets and the operation of the invisible hand that is so beneficial to society. Why Ridley would claim that business interests are inimical to free market ideals is baffling.

I hope and believe that Ridley is merely guilty of imprecision, and that he intended to convey that certain paths to profit are inconsistent with free market ideals. And in fact, he follows that last sentence with the following, which is quite right: capitalism is subverted by corporatism:

“We need to call out not just the worst examples of crony capitalism, but an awful lot of what passes for capitalism today — a creature of subsidy that lobbies governments for regulatory barriers to entry.“

And, of course, crony capitalism is not capitalism!

Now I’ll get off my soapbox and briefly return to the topic of an otherwise beautiful lecture by Ridley. He makes a number of fascinating points, including the following, which is one of the most unfortunate and paradoxical results in the history of economic and social thought:

“Somewhere along the line, we have let the market, that most egalitarian, liberal, disruptive, distributed and co-operative of phenomena, become known as a reactionary thing. It’s not. It is the most radical and liberating idea ever conceived: that people should be free to exchange goods and services with each other as they please, and thereby work for each other to improve each other’s lives.

In the first half of the 19th century this was well understood. To be a follower of Adam Smith was to be a radical left-winger, against imperialism, militarism, slavery, autocracy, the established church, corruption and the patriarchy.

Political liberation and economic liberation went hand in hand. Small government was a progressive proposition. Insofar as there was a revolution during the Industrial Revolution, it was the weakening of the power of the aristocracy and the landed interests, and the liberation of the bulk of the people.“

Do read the whole thing!

Playing Pretend Science Over Cocktails

13 Thursday Apr 2017

Posted by Nuetzel in Global Warming

≈ 2 Comments

Tags

97% Consensus, AGW, Carbon Forcing Models, Climate Feedbacks, CO2 and Greening, East Anglia University, Hurricane Frequency, Judith Curry, Matt Ridley, NOAA, Paleoclimate, Peer Review Corruption, Ross McKitrick, Roy Spencer, Sea Levels, Steve McIntyre, Temperature Proxies, Urbanization Bias

It’s a great irony that our educated and affluent classes have been largely zombified on the subject of climate change. Their brainwashing by the mainstream media has been so effective that these individuals are unwilling to consider more nuanced discussions of the consequences of higher atmospheric carbon concentrations, or any scientific evidence to suggest contrary views. I recently attended a party at which I witnessed several exchanges on the topic. It was apparent that these individuals are conditioned to accept a set of premises while lacking real familiarity with supporting evidence. Except in one brief instance, I avoided engaging on the topic, despite my bemusement. After all, I was there to party, and I did!

The zombie alarmists express their views within a self-reinforcing echo chamber, reacting to each others’ virtue signals with knowing sarcasm. They also seem eager to avoid any “denialist” stigma associated with a contrary view, so there is a sinister undercurrent to the whole dynamic. These individuals are incapable of citing real sources and evidence; they cite anecdotes or general “news-say” at best. They confuse local weather with climate change. Most of them haven’t the faintest idea how to find real research support for their position, even with powerful search engines at their disposal. Of course, the search engines themselves are programmed to prioritize the very media outlets that profit from climate scare-mongering. Catastrophe sells! Those media outlets, in turn, are eager to quote the views of researchers in government who profit from alarmism in the form of expanding programs and regulatory authority, as well as researchers outside of government who profit from government grant-making authority.

The Con in the “Consensus”

Climate alarmists take assurance in their position by repeating the false claim that 97% of climate scientists believe that human activity is the primary cause of warming global temperatures. The basis for this strong assertion comes from an academic paper that reviewed other papers, the selection of which was subject to bias. The 97% figure was not a share of “scientists”. It was the share of the selected papers stating agreement with the anthropogenic global warming (AGW) hypothesis. And that figure is subject to other doubts, in addition to the selection bias noted above: the categorization into agree/disagree groups was made by “researchers” who were, in fact, environmental activists, who counted several papers written by so-called “skeptics” among the set that agreed with the strong AGW hypothesis. So the “97% of scientists” claim is a distortion of the actual findings, and the findings themselves are subject to severe methodological shortcomings. On the other hand, there are a number of widely-recognized, natural reasons for climate change, as documented in this note on 240 papers published over just the first six months of 2016.

Data Integrity

It’s rare to meet a climate alarmist with any knowledge of how temperature data is actually collected. What exactly is the “global temperature”, and how can it be measured? It is a difficult undertaking, and it wasn’t until 1979 that it could be done with any reliability. According to Roy Spencer, that’s when satellite equipment began measuring:

“… the natural microwave thermal emissions from oxygen in the atmosphere. The intensity of the signals these microwave radiometers measure at different microwave frequencies is directly proportional to the temperature of different, deep layers of the atmosphere.“

Prior to the deployment of weather satellites, and starting around 1850, temperature records came only from surface temperature readings. These are taken at weather stations on land and collected at sea, and they are subject to quality issues that are generally unappreciated. Weather stations are unevenly distributed and they come and go over time; many of them produce readings that are increasingly biased upward by urbanization. Sea surface temperatures are collected in different ways with varying implications for temperature trends. Aggregating these records over time and geography is a hazardous undertaking, and these records are, unfortunately, the most vulnerable to manipulation.

The urbanization bias in surface temperatures is significant. According to this paper by Ross McKitrick, the number of weather stations counted in the three major global temperature series declined by more than 4,500 since the 1970s (over 75%), and most of those losses were rural stations. From McKitrick’s abstract:

“The collapse of the sample size has increased the relative fraction of data coming from airports to about 50% (up from about 30% in the late 1970s). It has also reduced the average latitude of source data and removed relatively more high altitude monitoring sites. Oceanic data are based on sea surface temperature (SST) instead of marine air temperature (MAT)…. Ship-based readings changed over the 20th century from bucket-and-thermometer to engine-intake methods, leading to a warm bias as the new readings displaced the old.“

Think about that the next time you hear about temperature records, especially NOAA reports on a “new warmest month on record”.

Data Manipulation

It’s rare to find alarmists having any awareness of the scandal at East Anglia University, which involved data falsification by prominent members of the climate change “establishment”. That scandal also shed light on corruption of the peer-review process in climate research, including a bias against publishing work skeptical of the accepted AGW narrative. Few are aware now of a very recent scandal involving manipulation of temperature data at NOAA in which retroactive adjustments were applied in an effort to make the past look cooler and more recent temperatures warmer. There is currently an FOIA outstanding for communications between the Obama White House and a key scientist involved in the scandal. Here are Judith Curry’s thoughts on the NOAA temperature manipulation.

Think about all that the next time you hear about temperature records, especially NOAA reports on a “new warmest month on record”.

Other Warming Whoppers

Last week on social media, I noticed a woman emoting about the way hurricanes used to frighten her late mother. This woman was sharing an article about the presumed negative psychological effects that climate change was having on the general public. The bogus premises: we are experiencing an increase in the frequency and severity of storms, that climate change is causing the storms, and that people are scared to death about it! Just to be clear, I don’t think I’ve heard much in the way of real panic, and real estate prices and investment flows don’t seem to be under any real pressure. In fact, the frequency and severity of severe weather has been in decline even as atmospheric carbon concentrations have increased over the past 50 years.

I heard another laughable claim at the party: that maps are showing great areas of the globe becoming increasingly dry, mostly at low latitudes. I believe the phrase “frying” was used. That is patently false, but I believe it’s another case in which climate alarmists have confused model forecasts with fact.

The prospect of rising sea levels is another matter that concerns alarmists, who always fail to note that sea levels have been increasing for a very long time, well before carbon concentrations could have had any impact. In fact, the sea level increases in the past few centuries are a rebound from lows during the Little Ice Age, and levels are now back to where the seas were during the Medieval Warm Period. But even those fluctuations look minor by comparison to the increases in sea levels that occurred over 8,000 years ago. Sea levels are rising at a very slow rate today, so slowly that coastal construction is proceeding as if there is little if any threat to new investments. While some of this activity may be subsidized by governments through cheap flood insurance, real money is on the line, and that probably represents a better forecast of future coastal flooding than any academic study can provide.

Old Ideas Die Hard

Two enduring features of the climate debate are 1) the extent to which so-called “carbon forcing” models of climate change have erred in over-predicting global temperatures, and 2) the extent to which those errors have gone unnoticed by the media and the public. The models have been plagued by a number of issues, not least because the climate is not a simple system. However, one basic shortcoming has to do with the existence of strong feedback effects: the alarmist community has asserted that feedbacks are positive, on balance, magnifying the warming impact of a given carbon forcing. In fact, the opposite seems to be true: second-order responses due to cloud cover, water vapor, and circulation effects are negative, on balance, at least partially offsetting the initial forcing.
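The sign of the net feedback matters enormously because of how feedbacks compound. The standard textbook linear-gain formula puts equilibrium warming at the no-feedback response divided by (1 − f), where f is the net feedback fraction. The sketch below is my own illustration of that arithmetic with a commonly cited no-feedback figure; the numbers are illustrative assumptions, not estimates from any particular model or from this post:

```python
# Textbook linear-feedback gain: equilibrium warming = no-feedback warming / (1 - f),
# where f is the net feedback fraction (positive amplifies, negative damps).
# All numbers below are illustrative assumptions, not measured values.

def equilibrium_warming(no_feedback_warming, f):
    """Equilibrium temperature response under a net feedback fraction f."""
    if f >= 1:
        raise ValueError("f >= 1 implies a runaway response; the linear model breaks down")
    return no_feedback_warming / (1.0 - f)

base = 1.1  # rough no-feedback warming (deg C) for doubled CO2, a common textbook figure

print(equilibrium_warming(base, 0.5))    # positive net feedback: response is amplified
print(equilibrium_warming(base, -0.5))   # negative net feedback: response is damped
```

The point of the exercise is that whether f nets out slightly positive or slightly negative roughly doubles or halves the projected warming, which is why the feedback assumption, more than the direct forcing, drives the disagreement described above.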

Fifty Years Ain’t History

One other amazing thing about the alarmist position is an insistence that the past 50 years should be taken as a permanent trend. On a global scale, our surface temperature records are sketchy enough today, but recorded history is limited to the very recent past. There are recognized methods for estimating temperatures in the more distant past by using various temperature proxies. These are based on measurements of other natural phenomena that are temperature-sensitive, such as ice cores, tree rings, and matter within successive sediment layers, including pollen and other organic compounds.

The proxy data has been used to create temperature estimates into the distant past. A basic finding is that the world has been this warm before, and even warmer, as recently as 1,000 years ago. This demonstrates the wide range of natural variation in the climate, and today’s global temperatures are well within that range. At the party I mentioned earlier, I was amused to hear a friend say, “Ya’ know, Greenland isn’t supposed to be green”, and he meant it! He is apparently unaware that Greenland was given that name by Viking settlers around 1000 AD, who inhabited the island during a warm spell lasting several hundred years… until it got too cold!

Carbon Is Not Poison

The alarmists take the position that carbon emissions are unequivocally bad for people and the planet. They treat carbon as if it is the equivalent of poisonous air pollution. The popular press often illustrates carbon emissions as black smoke pouring from industrial smokestacks, but carbon dioxide, like oxygen, is a colorless gas, and one upon which life itself depends.

Our planet’s vegetation thrives on carbon dioxide, and increasing carbon concentrations are promoting a “greening” of the earth. Crop yields are increasing as a result; reforestation is proceeding as well. The enhanced vegetation provides an element of climate feedback against carbon “forcings” by serving as a carbon sink, absorbing increasing amounts of carbon and converting it to oxygen.

Matt Ridley has noted one of the worst consequences of the alarmists’ carbon panic and its influence on public policy: the vast misallocation of resources toward carbon reduction, much of it dedicated to subsidies for technologies that cannot pass economic muster. Consider that those resources could be devoted to many other worthwhile purposes, like bringing electric power to third-world families who otherwise must burn dung inside their huts for heat; for that matter, perhaps the resources could be left under the control of taxpayers who can put them to the uses they value most highly. The regulatory burdens imposed by these policies on carbon-intensive industries represent lost output that can’t ever be recouped, and all in the service of goals that are of questionable value. And of course, the anti-carbon efforts almost certainly reflect a diversion of resources to the detriment of more immediate environmental concerns, such as mitigating truly toxic industrial pollutants.

The priorities underlying the alarm over climate change are severely misguided. The public should demand better evidence than consistently erroneous model predictions and manipulated climate data. Unfortunately, a media eager for drama and statism is complicit in the misleading narrative.

FYI: The cartoon at the top of this post refers to the climate blog climateaudit.org. The site’s blogger Steve McIntyre did much to debunk the “hockey stick” depiction of global temperature history, though it seems to live on in the minds of climate alarmists. McIntyre appears to be on an extended hiatus from the blog.

Embracing the Robots

03 Friday Mar 2017

Posted by Nuetzel in Automation, Labor Markets, Technology

≈ 1 Comment

Tags

3-D Printing, Artificial Intelligence, Automation, David Henderson, Don Boudreaux, Great Stagnation, Herbert Simon, Human Augmentation, Industrial Revolution, Marginal Revolution, Mass Unemployment, Matt Ridley, Russ Roberts, Scarcity, Skills Gap, Transition Costs, Tyler Cowen, Wireless Internet


Machines have always been regarded with suspicion as a potential threat to the livelihood of workers. That is still the case, despite the demonstrated power of machines to make life easier and goods cheaper. Today, the automation of jobs in manufacturing and even service jobs has raised new alarm about the future of human labor, and the prospect of a broad deployment of artificial intelligence (AI) has made the situation seem much scarier. Even the technologists of Silicon Valley have taken a keen interest in promoting policies like the Universal Basic Income (UBI) to cushion the loss of jobs they expect their inventions to precipitate. The UBI is an idea discussed in last Sunday’s post on Sacred Cow Chips. In addition to the reasons for rejecting that policy cited in that post, however, we should question the premise that automation and AI are unambiguously job killing.

The same stories of future joblessness have been told for over two centuries, and they have been wrong every time. The vulnerability in our popular psyche with respect to automation is four-fold: 1) the belief that we compete with machines, rather than collaborate with them; 2) our perpetual inability to anticipate the new and unforeseeable opportunities that arise as technology is deployed; 3) our tendency to undervalue new technologies for the freedoms they create for higher-order pursuits; and 4) the heavy discount we apply to the ability of workers and markets to anticipate and adjust to changes in market conditions.

Despite the technological upheavals of the past, employment has not only risen over time, but real wages have as well. Matt Ridley writes of just how wrong the dire predictions of machine-for-human substitution have been. He also disputes the notion that “this time it’s different”:

“The argument that artificial intelligence will cause mass unemployment is as unpersuasive as the argument that threshing machines, machine tools, dishwashers or computers would cause mass unemployment. These technologies simply free people to do other things and fulfill other needs. And they make people more productive, which increases their ability to buy other forms of labour. ‘The bogeyman of automation consumes worrying capacity that should be saved for real problems,’ scoffed the economist Herbert Simon in the 1960s.“

As Ridley notes, the process of substituting capital for labor has been more or less continuous over the past 250 years, and there are now more jobs, and at far higher wages, than ever. Automation has generally involved replacement of strictly manual labor, but it has always required collaboration with human labor to one degree or another.

The tools and machines we use in performing all kinds of manual tasks become ever-more sophisticated, and while they change the human role in performing those tasks, the tasks themselves largely remain or are replaced by new, higher-order tasks. Will the combination of automation and AI change that? Will it make human labor obsolete? Call me an AI skeptic, but I do not believe it will have broad enough applicability to obviate a human role in the production of goods and services. We will perform tasks much better and faster, and AI will create new and more rewarding forms of human-machine collaboration.

Tyler Cowen believes that AI and  automation will bring powerful benefits in the long run, but he raises the specter of a transition to widespread automation involving a lengthy period of high unemployment and depressed wages. Cowen points to a 70-year period for England, beginning in 1760, covering the start of the industrial revolution. He reports one estimate that real wages rose just 22% during this transition, and that gains in real wages were not sustained until the 1830s. Evidently, Cowen views more recent automation of factories as another stage of the “great stagnation” phenomenon he has emphasized. Some commenters on Cowen’s blog, Marginal Revolution, insist that estimates of real wages from the early stages of the industrial revolution are basically junk. Others note that the population of England doubled during that period, which likely depressed wages.

David Henderson does not buy into Cowen’s pessimism about transition costs. For one thing, a longer perspective on the industrial revolution would undoubtedly show that average growth in the income of workers was dismal or nonexistent prior to 1760. Henderson also notes that Cowen hedges his description of the evidence of wage stagnation during that era. It should also be mentioned that the share of the U.S. work force engaged in agricultural production was 40% in 1900, but is only 2% today, and the rapid transition away from farm jobs in the first half of the 20th century did not itself lead to mass unemployment nor declining wages (HT: Russ Roberts). Cowen cites more recent data on stagnant median income, but Henderson warns that even recent inflation adjustments are fraught with difficulties, that average household size has changed, and that immigration, by adding households and bringing labor market competition, has had at least some depressing effect on the U.S. median wage.

Even positive long-run effects and a smooth transition in the aggregate won’t matter much to any individual whose job is easily automated. There is no doubt that some individuals will fall on hard times, and finding new work might require a lengthy search, accepting lower pay, or retraining. Can something be done to ease the transition? This point is addressed by Don Boudreaux in another context in “Transition Problems and Costs“. Specifically, Boudreaux’s post is about transitions made necessary by changing patterns of international trade, but his points are relevant to this discussion. Most fundamentally, we should not assume that the state must have a role in easing those transitions. We don’t reflexively call for aid when workers of a particular firm lose their jobs because a competitor captures a greater share of the market, nor when consumers decide they don’t like their product. In the end, these are private problems that can and should be solved privately. However, the state certainly should take a role in improving the function of markets such that unemployed resources are absorbed more readily:

“Getting rid of, or at least reducing, occupational licensing will certainly help laid-off workers transition to new jobs. Ditto for reducing taxes, regulations, and zoning restrictions – many of which discourage entrepreneurs from starting new firms and from expanding existing ones. While much ‘worker transitioning’ involves workers moving to where jobs are, much of it also involves – and could involve even more – businesses and jobs moving to where available workers are.“

Boudreaux also notes that workers should never be treated as passive victims. They are quite capable of acting on their own behalf. They often act out of risk avoidance to save their funds against the advent of a job loss, invest in retraining, and seek out new opportunities. There is no question, however, that many workers will need new skills in an economy shaped by increasing automation and AI. This article discusses some private initiatives that can help close the so-called “skills gap”.

Crucially, government should not accelerate the process of automation beyond its natural pace. That means markets and prices must be allowed to play their natural role in directing resources to their highest-valued uses. Unfortunately, government often interferes with that process by imposing employment regulations and wage controls — i.e., the minimum wage. Increasingly, we are seeing that many jobs performed by low-skilled workers can be automated, and the expense of automation becomes more worthwhile as the cost of labor is inflated to artificial levels by government mandate. That point was emphasized in a 2015 post on Sacred Cow Chips entitled “Automate No Job Before Its Time“.

Another past post on Sacred Cow Chips called “Robots and Tradeoffs” covered several ways in which we will adjust to a more automated economy, none of which will require the intrusive hand of government. One certainty is that humans will always value human service, even when a robot is more efficient, so there will always be opportunities for work. There will also be ways in which humans can compete with machines (or collaborate more effectively) via human augmentation. Moreover, we should not discount the potential for the ownership of machines to become more widely dispersed over time, mitigating the feared impact of automation on the distribution of income. The diffusion of specific technologies becomes more widespread as their costs decline. That phenomenon has unfolded rapidly with wireless technology, particularly the hardware and software necessary to make productive use of the wireless internet. The same is likely to occur with 3-D printing and other advances. For example, robots are increasingly entering consumer markets, and there is no reason to believe that the same downward cost pressures won’t allow them to be used in home production or small-scale business applications. The ability to leverage technology will require learning, but web-enabled instruction is becoming increasingly accessible as well.

Can the ownership of productive technologies become sufficiently widespread to assure a broad distribution of rewards? It’s possible that cost reductions will allow that to happen, but broadening the ownership of capital might require new saving constructs as well. That might involve cooperative ownership of capital by associations of private parties engaged in diverse lines of business. Stable family structures can also play a role in promoting saving.

It is often said that automation and AI will mean an end to scarcity. If that were the case, the implications for labor would be beside the point. Why would anyone care about jobs in a world without want? Of course, work might be done purely for pleasure, but that would make “labor” economically indistinguishable from leisure. Reaching that point would mean a prolonged process of falling prices, lifting real wages on a pace matching increases in productivity. But in a world without scarcity, prices must be zero, and that will never happen. Human wants are unlimited and resources are finite. We’ll use resources more productively, but we will always find new wants. And if prices are positive, including the cost of capital, it is certain that demands for labor will remain.

The Greening-Carbon Nexus

17 Saturday Dec 2016

Posted by Nuetzel in Environment, Global Warming

≈ 2 Comments

Tags

Atmospheric Carbon Concentration, Climate Change, Climate Consensus, David Henderson, Global Greening, global warming, Harrison H. Schmitt, Matt Ridley, Pollution, Rand Paul, Rodney W. Nichols, Roy Spencer, Thomas Malthus


Satellite records show that our world is experiencing a remarkable “greening” in the 21st century, to the seeming chagrin of the environmental left. There is now more vegetation than two decades ago, and greener vegetation, across as much as 50% of the Earth’s vegetated surface area. That area is expanding as well, and the creeping greenery has improved soil moisture levels in some drylands. This bodes well for agricultural productivity, putting another nail in Malthus’ coffin. The satellite studies have concluded that most of the enhanced vegetation is attributable to greater concentration of CO2 in the atmosphere, as opposed to warming or other possible causes. An interesting feedback is that the enhanced vegetation increases natural absorption of CO2, providing an enhanced carbon sink. This, in turn, has caused a pause in the growth of atmospheric carbon concentration.

The environmental left knows these developments tend to undermine their preferred narrative that human emissions of CO2 must be reduced — at any cost. In fact, there are already warnings that global greening will “outgrow its benefit” as the greater volume of plants begins to decay, releasing carbon. You just can’t make some people happy! But not all of the carbon released from plant decay adds to atmospheric carbon — some is soil-bound — so the greening should provide a fairly durable carbon sink.

Global greening was one of the major motifs in Matt Ridley’s 2016 Global Warming Policy Foundation Lecture. Ridley covered various evidence of greening, but he also discussed the failure of a large contingent of climate researchers to follow a legitimate scientific approach to the study of climate change. Instead, they have politicized their field of study, committing a few noteworthy frauds along the way:

“It is irresponsible not to challenge the evidence properly, especially if the policies pursued in its name are causing suffering. Increasingly, many people would like to outlaw, suppress, prosecute and censor all discussion of what they call ‘the science’ rather than engage in debate. …

No wonder that I talk frequently to scientists who are skeptical, but dare not say so openly. That is a ridiculous state of affairs. We’re told that it’s impertinent to question “the science” and that we must think as we are told. But arguments from authority are the refuge of priests. Thomas Henry Huxley put it this way: ‘The improver of natural knowledge absolutely refuses to acknowledge authority, as such. For him, scepticism is the highest of duties; blind faith the one unpardonable sin’. 

What keeps science honest, what stops it from succumbing entirely to confirmation bias, is that it is decentralized, allowing one lab to challenge another.“

It is all too true that policies advanced in the interests of curbing a slight warming trend cause real suffering, and the pain is heavily concentrated on the most impoverished. The presumed benefits of activist climate-change policies are speculative, at best. They have little chance of reversing the growth of atmospheric carbon concentration on their own.

Ridley makes note of the substantial evidence that sensitivity of the climate to airborne carbon concentration is low. This has become increasingly evident with the unfolding of a consistent record of over-forecasts of global temperatures by climate forcing models. Roy Spencer provides insights about these models in a recent discussion of global warming and “dodgy science” on his blog.

There is a widespread myth that 97 percent of climate scientists believe human activity is the main cause of global warming. In fact, that claim was based on a paper counting citations, not scientists; the methods used in the study and the citations themselves were also questionable. I have reviewed that evidence here on Sacred Cow Chips. David Henderson reviewed it here. A large number of studies find fault with so-called “consensus” pronouncements. They should always be viewed with suspicion.

There is also a lively debate underway over whether CO2 should be considered a pollutant! I exhale, therefore I pollute? To the extent that fecal matter is considered a pollutant, is it fair to say that CO2 is, too? After all, both are anthropogenic. No, they are not even close in terms of an immediate threat to human health. As a philosophical matter, the idea that anything done by man is “unnatural” denies the fact that we are very much a part of nature. Obviously, CO2 is not in the same class as pollutants like sulfur dioxide, ammonia, carbon monoxide or toxic metals. Today, those pollutants are all too common in many parts of the world, and they are very threatening to human life. Effective mitigation technologies are available, but instead, in the developed West, we fixate on an increase in CO2 concentration of 100 parts per million over many decades, the climate implications of which are de minimis.

Rand Paul’s Facebook page has an ungated link to a WSJ.com commentary by Rodney W. Nichols and Harrison H. Schmitt on “The Phony War Against CO2”. Their commentary provokes questions as to the motives of the environmental left, and certain members of the research community, in shilling for the cause. That we would fight the greening of the globe, and the potential agricultural benefit it could bring, is bizarre. To devote enormous resources to an endeavor that is largely futile is a waste and a tragedy.

 

Big-Time Regulatory Rewards

26 Tuesday Jul 2016

Posted by Nuetzel in Big Government, Central Planning, Regulation

≈ Leave a comment

Tags

Cronyism, Daniel Mitchell, Glenn Reynolds, Guy Rolnik, Harvard Business Review, Industrial Policy, James Bessen, Matt Ridley, Mercatus Center, Regdata, regressivity


Why does regulation of private industry so often inure to the benefit of the regulated at the expense of consumers? In the popular mind, at least, regulating powerful market players restrains “excessive” profits or ensures that their practices meet certain standards. More often than not, however, regulation empowers the strongest market players at the expense of the very competition that would otherwise restrain prices and provide innovative alternatives. The more complex the regulation, the more likely that will be the result. Smaller firms seldom have the wherewithal to deal with complicated regulatory compliance. Moreover, regulatory standards are promulgated by politicians, bureaucrats, and often the most powerful market players themselves. If ever a system was “rigged”, to quote a couple of well-known presidential candidates, it is the regulatory apparatus. Pro-regulation candidates might well have the voters’ best interests at heart, or maybe not, but the losers are usually consumers and the winners are usually the dominant firms in any regulated industry.

The extent to which our wanderings into the regulatory maze have rewarded crony capitalists — rent seekers — is bemoaned by Daniel Mitchell in “A Very Depressing Chart on Creeping Cronyism in the American Economy“. The chart shows that about 40% of the increase in U.S. corporate profits since 1970 was generated by rent-seeking efforts — not by activities that enhance productivity and output. The chart is taken from an article in the Harvard Business Review by James Bessen of Boston University called “Lobbyists Are Behind the Rise in Corporate Profits“. Here are a couple of choice quotes from the article:

“Lobbying and political campaign spending can result in favorable regulatory changes, and several studies find the returns to these investments can be quite large. For example, one study finds that for each dollar spent lobbying for a tax break, firms received returns in excess of $220. …regulations that impose costs might raise profits indirectly, since costs to incumbents are also entry barriers for prospective entrants. For example, one study found that pollution regulations served to reduce entry of new firms into some manufacturing industries.”

“This research supports the view that political rent seeking is responsible for a significant portion of the rise in profits [since 1970]. Firms influence the legislative and regulatory process and they engage in a wide range of activity to profit from regulatory changes, with significant success. …while political rent seeking is nothing new, the outsize effect of political rent seeking on profits and firm values is a recent development, largely occurring since 2000. Over the last 15 years, political campaign spending by firm PACs has increased more than thirtyfold and the Regdata index of regulation has increased by nearly 50% for public firms.“

A good explanation of Bessen’s findings is provided by Guy Rolnik, including an interview with Bessen. Law Professor Glenn Reynolds of the University of Tennessee put his finger on the same issue in an earlier article entitled “Why we still don’t have flying cars“. One can bicker about the relative merits of various regulations, but as Reynolds points out, the expansion of the administrative and regulatory state has led to a massive diversion of resources that is very much a detriment to the intended beneficiaries of regulation:

“… 1970 marks what scholars of administrative law (like me) call the ‘regulatory explosion.’ Although government expanded a lot during the New Deal under FDR, it wasn’t until 1970, under Richard Nixon, that we saw an explosion of new-type regulations that directly burdened people and progress: The Clean Air Act, the Clean Water Act, National Environmental Policy Act, the founding of the Occupational Safety and Health Administration, the creation of the Environmental Protection Agency, etc. — all things that would have made the most hard-boiled New Dealer blanch.

Within a decade or so, Washington was transformed from a sleepy backwater (mocked by John F. Kennedy for its ‘Southern efficiency and Northern charm’) to a city full of fancy restaurants and expensive houses, a trend that has only continued in the decades since. The explosion of regulations led to an explosion of people to lobby the regulators, and lobbyists need nice restaurants and fancy houses.“

Matt Ridley hits on a related point in “Industrial Strategy Can Be Regressive“, meaning that government planning and industrial regulation have perverse effects on prices and economic growth that hit the poor the hardest. Ridley, who is British, discusses regressivity in the context of his country’s policy environment, but the lessons are general:

“The history of industrial strategies is littered with attempts to pick winners that ended up picking losers. Worse, it is government intervention, not laissez faire, that has done most to increase inequality and to entrench wealth and privilege. For example, the planning system restricts the supply of land for housebuilding, raising property prices to the enormous benefit of the haves (yes, that includes me) at the expense of the have-nots. … 

Why are salaries so high in financial services? Because there are huge barriers to entry erected by government, which hands incumbent firms enormous quasi-monopoly advantages and thereby shelters them from upstart competition. Why are cancer treatments so expensive? Because governments give monopolies called patents to the big firms that invent them. Why are lawyers so rich? Because there is a government-licensed cartel restricting the supply of them.“

Ridley’s spirited article gives emphasis to the fact that the government cannot plan the economy any more than it can plan the way our tastes and preferences will evolve and respond to price incentives; it cannot plan production any more than it can anticipate changes in resource availability; it cannot dictate technologies wisely any more than it can predict the innumerable innovations brought forth by private initiative and market needs; it almost never can regulate any better than the market can regulate itself! But government is quite capable of distorting prices, imposing artificial rules, picking suboptimal technologies, consuming resources, and rewarding cronies. One should never underestimate the potential for regulation, and government generally, to screw things up!

Anti-Glyphosate Goons and Gullibility

15 Sunday May 2016

Posted by Nuetzel in Agriculture, Regulation, Technology

≈ Leave a comment

Tags

Biology Fortified, Carcinogens, Christopher Portier, David Zaruk, Environmental Defense Fund, EPA, Farmer's Daughter, Glyphosate, IARC, International Agency for Research on Cancer, Julie Kelly, Kathryn Guyton, Matt Ridley, pseudoscience, Rational Optimist, Risk Monger, Roundup, Toxicity, WHO, World Health Organization


See the Postscript below.

A “roundup” of findings on the safety of glyphosate shows that the herbicide is very benign, highly unlikely to pose any real threat to humans, and far less toxic than many common household chemicals and even natural hazards in the environment. However, the debate over glyphosate is heavily politicized, as illustrated by the unsavory details surrounding a report issued last year by the International Agency for Research on Cancer (IARC), an arm of the World Health Organization (WHO). The IARC reclassified glyphosate as “probably carcinogenic to humans” based on a few cherry-picked, poorly-designed studies with weak statistical power. That finding is inconsistent with the vast preponderance of research, which shows that glyphosate is not a significant threat to human health.

The Farmer’s Daughter provided a good summary of the issues shortly after the IARC’s ruling was announced last year. She offers the following quote from the U.S. Environmental Protection Agency (EPA):

“The U.S. EPA classified glyphosate as Group E, evidence of non-carcinogenicity in humans. The U.S. EPA does not consider glyphosate to be a human carcinogen based on studies of laboratory animals that did not produce compelling evidence of carcinogenicity.“

European regulators reached similar conclusions and were rather damning in their assessment of the IARC’s findings, though Brussels recently disregarded that advice and decided to ban the sale of glyphosate for gardening. In this post at Biology Fortified, Anastasia Bodnar discusses the low toxicity of glyphosate with links to several recent studies on its safety. And here is the Risk Monger blog’s list of “ten reasons why glyphosate is the herbicide of the century“:

  1. Controlling invasive weeds leads to better agricultural yields
  2. Better yields = less land in production = more meadows and biodiversity
  3. Extremely low toxicity levels compared to (organic) alternatives
  4. Allows for no or low till farming – better for soil management
  5. Reduces CO2 emissions (compared to organic)
  6. Glyphosate saves lives
  7. It is much more affordable and effective than other options
  8. Glyphosate is off patent so no single company is profiting heavily from it
  9. Glyphosate-resistant crops allow for more ecological weed management practices
  10. There is overwhelming scientific evidence that glyphosate is safe for humans

How, then, did the IARC reach such a negative conclusion? According to the Risk Monger, David Zaruk, the IARC hired just one external technical advisor, Christopher Portier, an activist previously employed by the anti-pesticide NGO the Environmental Defense Fund (EDF). Portier has no technical background in toxicology, and the IARC apparently went to pains to avoid references to his affiliation with the EDF. Moreover, the IARC’s conclusion seems to have been preordained:

“The IARC study rejected thousands of documents on glyphosate that had industry involvement and based their decision on carcinogenicity on the basis of eight studies (rejecting a further six because they did not like their conclusions).“

The lead author of the report, Kathryn Guyton, gave a speech in 2014 in which she stated that herbicide studies slated for 2015 showed indications of a link to cancer. Just how did she know, so far ahead of time? And then there’s this revelation:

“According to the observer document, the glyphosate meeting started with the participants being told to rule out the possibility of classifying the substance as non-carcinogenic.“

Zaruk believes there is internal pressure for the IARC study to be retracted. The organization has suffered a great loss of credibility in the scientific community over the report. WHO itself has remained neutral thus far, but it is expected to address the issue this month.

Zaruk and Julie Kelly provide a more succinct summary of the issues in “The Facebook Age of Science at The World Health Organization” at National Review. The title suggests that WHO’s decision might be swayed less by science than by public pressure, as measured in Facebook “likes” from the superstitious, such as unknowing David Wolfe devotees:

“Environmentalists and organic companies tout phony studies claiming that glyphosate is found in everything from breast milk to bagels. … Meanwhile, farmers who use glyphosate to protect their crops and boost yields are caught in the crossfire. Even if glyphosate is banned, they will need to use another herbicide, probably more toxic, because the romantic notion of hand-weeding millions of acres of crops is promoted only by those who have never done it.“

I’ll keep using Monsanto’s Roundup, thanks! Or a competitive brand of glyphosate. To close, here’s a quote from Matt Ridley’s Rational Optimist blog on the embrace of pseudoscience at the IARC and elsewhere (including social media):

“Science, humanity’s greatest intellectual achievement, has always been vulnerable to infection by pseudoscience, which pretends to use the methods of science, but actually subverts them in pursuit of an obsession. Instead of evidence-based policymaking, pseudoscience specialises in policy-based evidence making. Today, this infection is spreading.“

Postscript: On May 16, WHO announced that glyphosate is “unlikely to cause cancer in people via dietary exposure.” Here is a Q&A from WHO regarding its assessment, explaining that it is based on risk as opposed to mere hazard, upon which the earlier IARC report was based. This is good news!

 
