Sacred Cow Chips

Tag Archives: Scientism

Price Controls: Political Gut Reaction, Gut Punch To Public

06 Thursday Jan 2022

Posted by pnoetx in Price Controls, Shortage

≈ Leave a comment

Tags

Artificial Tradeoffs, Big Meat, Big Oil, Black Markets, central planning, Excess Demand, Federal Reserve, Inflation, Isabella Weber, Joe Biden, Money Supply, Paul Krugman, Price Controls, Relative Prices, Scientism, Shortage, Unintended Consequences

In a gross failure of education or perhaps memory, politicians, policymakers, and certain academics seem blithely ignorant of things we’ve learned repeatedly. And of all the dumb ideas floated regarding our current bout with inflation, the notion of invoking price controls is near the top. But watch out, because the Biden Administration has already shifted from “inflation is transitory” to “it only hurts the rich” to “it’s fine because people just want to buy things”, and now “greedy businessmen are the culprits”. The latter falsehood is indeed the rationale for price controls put forward by a very confused economist at the University of Massachusetts-Amherst named Isabella Weber. (See this for an excerpt and a few immediate reactions.) She makes me grieve for my profession… even the frequently ditzy Paul Krugman called her out, though he softened his words after realizing he might have offended some of his partisan allies. Of course, the idea of price controls is just bad enough to gain favor with the lefty goofballs pulling Biden’s strings.

To understand the inflation process, it’s helpful to distinguish between two different dynamics:

1. When prices change we usually look for explanations in supply and demand conditions. We have supply constraints across a range of markets at the moment. There’s also a great deal to say about the ways in which government policy is hampering supplies of labor and energy, which are key inputs for just about everything. It’s fair to note here that, rather than price controls, we just might do better to ask government to get out of the way! In addition, however, consumer demand rebounded as the pandemic waned and waxed, and the federal government has been spending hand over fist, with generous distributions of cash with no strings attached. Thus, supply shortfalls and strong demand have combined to create price pressures across many markets.

2. Economy-wide, all dollar prices cannot rise continuously without an excess supply of a monetary asset. The Federal Reserve has discussed tapering its bond purchases in 2022 and its intention to raise overnight interest rates starting in the spring. It’s about time! The U.S. money supply ballooned during 2020 and its growth remains at a gallop. This has enabled the inflation we are experiencing today, and only recently have the markets begun to react as if the Fed means business.
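
A compact way to summarize this second dynamic is the textbook equation of exchange (a standard identity, added here for illustration rather than drawn from the original post):

$$ M \cdot V = P \cdot Q $$

where $M$ is the money supply, $V$ its velocity of circulation, $P$ the price level, and $Q$ real output. If $M$ grows persistently faster than $Q$, and $V$ does not fall enough to offset it, $P$ must eventually rise. Supply disruptions determine which prices rise first and fastest; a sustained, economy-wide rise in prices requires the monetary fuel.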

Weber, our would-be price controller, exhibits a marked ignorance with respect to both aspects of price pressure: how markets work in the first instance, and how monetary profligacy lies at the root of broader inflation. Instead, she insists that prices are rising today because industrialists have simply decided to extract more profit! Poof! It’s as simple as that! Well what was holding those greedy bastards back all this time?

Everyone competes for scarce resources, so prices are bid upward when supplies are short, inputs more costly, or demand is outpacing supply for other reasons. Sure, sellers may earn a greater margin on sales under these circumstances. But the higher price accomplishes two important social objectives: efficient rationing of available quantities, and greater incentives to bring additional supplies to market.

So consider the outcome when government takes the advice of a Weber: producers are prohibited from adjusting price in response to excess demand. Shortages develop. Consumers might want more, but that’s either impossible or it simply costs more. Yet producers are prohibited from pricing commensurate with that cost. Other adjustments soon follow, such as changes in discounts, seller credit arrangements, and product quality. Furthermore, absent price adjustment, transaction costs become much more significant. Other resources are consumed in the mere process of allocating available quantities: time spent in queues, administering quotas, lotteries or other schemes, costly barter, and ultimately unsatisfied needs and wants, not to mention lots of anger and frustration. Lest anyone think this process is “fair”, keep in mind that it’s natural for these allocations to take a character that is worse than arbitrary. “Important people” will always have an advantage under these circumstances.
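
A minimal numerical sketch of this mechanism, using made-up linear supply and demand curves (all parameters are illustrative assumptions, not estimates of any real market):

```python
# Illustrative sketch: a binding price ceiling with linear supply and demand.
# All parameters are hypothetical; only the direction of the effects matters.

def demand(p):
    """Quantity demanded falls as the price rises."""
    return max(0.0, 100 - 2 * p)

def supply(p):
    """Quantity supplied rises with the price."""
    return max(0.0, -20 + 4 * p)

# Market clearing: 100 - 2p = -20 + 4p  ->  p* = 20, q* = 60
p_star = 20
print("market price:", p_star, "quantity traded:", demand(p_star))

# Government caps the price below the market-clearing level.
ceiling = 15
q_supplied = supply(ceiling)   # 40: sellers bring less to market
q_demanded = demand(ceiling)   # 70: buyers want more at the low price
print("traded under the ceiling:", min(q_supplied, q_demanded))
print("excess demand (shortage):", q_demanded - q_supplied)
```

The 30-unit gap in this toy example is the excess demand that must then be rationed by something other than price: queues, quotas, favoritism, or black markets.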

Regulatory and financial burdens are imposed on those who play by the rules, but not everyone does. Black market mechanisms come into play, including opportunities for illegal side payments, rewards for underworld activity, along with a general degradation in the rule of law.

Price controls also impose rigidity in relative prices that can be very costly for society. “Freezing” the value of one good in terms of others distorts the signals upon which efficient resource allocation depends. Tastes, circumstances, and production technology change, and flexible relative prices enable smoother transitions between these states. And even while demand and/or input scarcity might increase in all markets, these dynamics are never uniform. Over time, imbalances always become much larger in some markets than others. Frozen relative prices allow these imbalances to persist.

For example, the true value of good A at the imposition of price controls might be two units of good B. Over time, the true value of A might grow to four units of good B, but the government insists that A must be traded for no more than the original two units of B. Good B thus becomes overvalued on account of government intervention. The market for good A, which should attract disproportionate investment and jobs, will instead languish under a freeze of relative prices. Good B will continue to absorb resources under the artificial tradeoff imposed by price controls. Society must then sacrifice the gains otherwise afforded by market dynamism.

The history of price controls is dismal (also see here). They artificially suppress measured inflation and impose great efficiency costs on the public. Meanwhile, price controls fail to address the underlying monetary excess.

Price controls are destructive when applied economy-wide, but also when governments attempt to apply them to markets selectively. Posturing about “strategic” use of price controls reveals the naïveté of those who believe government planners can resolve market dislocations better than market participants themselves. Indeed, the planners would do better to discover, and undo, the damage caused by so many ongoing regulatory interventions.

So beware Joe Biden’s bluster about “greedy producers” in certain markets, whether they be in “Big Meat”, or “Big Oil”. Price interventions in these markets are sure to bring you less meat, less oil, and quite possibly less of everything else. The unintended consequences of such government interventions aren’t difficult to foresee unless one is blinded with the scientism of central planning.

Hyperbolic Scenarios, Crude Climate Models, and Scientism

07 Sunday Nov 2021

Posted by pnoetx in Climate science, Global Warming

≈ 5 Comments

Tags

Carbon Efficiency, Carbon forcing, carbon Sensitivity, Cloud Feedback, COP26, G20, Global Temperature, IEA, Intergovernmental Panel on Climate Change, International Energy Agency, IPCC, Joe Biden, Joe Brandon, Judith Curry, Justin Ritchie, Net Zero Emissions, Nic Lewis, Precautionary Principle, Prince Charles, RCP8.5, rent seeking, Representative Concentration Pathway, Roger Pielke Jr., Scientism, United Nations

What we hear regarding the dangers of climate change is based on predictions of future atmospheric carbon concentrations and corresponding predictions of global temperatures. Those predictions are not “data” in the normal, positive sense. They do not represent “the way things are” or “the way things have been”, though one might hope the initial model conditions align with reality. Nor can the predictions be relied upon as “the way things will be”. Climate scientists normally report a range of outcomes produced by models, yet we usually hear only one type of consequence for humanity: catastrophe!

Models Are Not Reality

The kinds of climate models quoted by activists and by the UN’s Intergovernmental Panel on Climate Change (IPCC) have been around for decades. Known as “carbon forcing” models, they are highly simplified representations of the process determining global temperatures. The primary forecast inputs are atmospheric carbon concentrations over time, which again are themselves predictions.

It’s usually asserted that climate model outputs should guide policy, but we must ask: how much confidence can we have in the predictions to allow government to take coercive actions having immediate, negative impacts on human well being? What evidence can be marshaled to show prospective outcomes under proposed policies? And how well do these models fit the actual, historical data? That is, how well do model predictions track our historical experience, given the historical paths of inputs like carbon concentrations?

Faulty Inputs

The IPCC has been defining and updating sets of carbon scenarios since 1990. The scenarios outline the future paths of greenhouse gas emissions (and carbon forcings). They were originally based on economic and demographic modeling before an apparent “decision by committee” to maintain consistency with scenarios issued in the past. Roger Pielke Jr. and Justin Ritchie describe the evolution of this decision process, and they call for change:

“Our research (and that of several colleagues) indicates that the scenarios of greenhouse gas (GHG) emissions through the end of the twenty-first century are grounded in outdated portrayals of the recent past. Because climate models depend on these scenarios to project the future behavior of the climate, the outdated scenarios provide a misleading basis both for developing a scientific evidence base and for informing climate policy discussions. The continuing misuse of scenarios in climate research has become pervasive and consequential—so much so that we view it as one of the most significant failures of scientific integrity in the twenty-first century thus far. We need a course correction.”

One would certainly expect the predicted growth of atmospheric carbon to evolve over time. However, as Pielke and Ritchie note, the IPCC’s baseline carbon scenario today, known as RCP8.5 (“Representative Concentration Pathway”), is remarkably similar to the “business as usual” (BAU) scenario it first issued in 1990:

“The emissions scenarios the climate community is now using as baselines for climate models depend on portrayals of the present that are no longer true. And once the scenarios lost touch with reality, so did the climate, impact, and economic models that depend on them for their projections of the future. Yet these projections are a central part of the scientific basis upon which climate policymakers are now developing, debating, and adopting policies.”

The authors go on to discuss a few characteristics of the BAU scenario that today seem implausible, including:

“… RCP8.5 foresees carbon dioxide emissions growing rapidly to at least the year 2300 when Earth reaches more than 2,000 ppm of atmospheric carbon dioxide concentrations. But again, according to the IEA and other groups, fossil energy emissions have likely plateaued, and it is plausible to achieve net-zero emissions before the end of the century, if not much sooner.”

Pielke and Ritchie demonstrate that the IPCC’s baseline range of carbon emissions by 2045 is centered well above (actually double) the mid-range of scenarios developed by the International Energy Agency (IEA), and there is very little overlap between the two. However, global carbon emissions have been flat over the past decade. Even if we extrapolate the growth in atmospheric CO2 parts per million over the past 20 years, it would rise to less than 600 ppm by 2100, not 1,200 ppm. It’s true that a few countries (China comes to mind) continue to exploit less “carbon efficient” energy resources like coal, but the growth trend in concentrations is likely to continue to taper over time.
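
The arithmetic behind that extrapolation is simple enough to show directly. The concentration figures below are approximate (roughly in line with published annual means) and are used here only as illustrative assumptions:

```python
# Back-of-envelope linear extrapolation of atmospheric CO2 concentration.
# Approximate annual means, assumed for illustration: ~371 ppm (2001), ~416 ppm (2021).

ppm_2001, ppm_2021 = 371.0, 416.0
rate = (ppm_2021 - ppm_2001) / (2021 - 2001)   # about 2.25 ppm per year

ppm_2100 = ppm_2021 + rate * (2100 - 2021)
print(f"trend: {rate:.2f} ppm/yr -> roughly {ppm_2100:.0f} ppm by 2100")
# Roughly 590-600 ppm: well short of the ~1,200 ppm figure cited above, and
# the trend would have to accelerate sharply to get anywhere near that level.
```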

It therefore appears that the IPCC’s climate scenarios, which are used broadly as model inputs by the climate research community, are suspect. And as the saying goes: garbage in, garbage out. But what about the climate models themselves?

Faulty Models

The model temperature predictions have been grossly in error. They have been and continue to be “too hot”. The chart at the top of this post is typical of the comparisons of model projections and actual temperatures. Before the year 2000, most of the temperature paths projected by the particular model charted above ran higher than actual temperatures. However, the trends subsequently diverged and the gap has become more extreme over the past two decades.

The problem is not merely one of faulty inputs. The models themselves are deeply flawed, as they fail to account adequately for natural forces that strongly influence our climate. It’s been clear for many years that the sun’s radiative energy has a massive impact on temperatures, and it is affected not only by the intensity of the solar cycle but also by cloud cover on Earth. Unfortunately, carbon forcing models do not agree on the role that increased clouds might have in amplifying warming. However, a reduction in cloud cover over the past 20 years, and a corresponding increase in radiative heat, can account for every bit of the warming experienced over that time.

This finding not only offers an alternative explanation for two decades of modest warming, it also strikes at the very heart of the presumed feedback mechanism usually assumed to amplify carbon-induced warming. The overall effect is summarized by the so-called carbon sensitivity, measured as the response of global temperature to a doubling of carbon concentration. The IPCC puts that sensitivity in a range of 1.5C to 4.5C. However, findings published by Nic Lewis and Judith Curry are close to the low end of that range, as are those found by Frank Bosse reported here. The uncertainties surrounding the role of cloud cover and carbon sensitivities reveal that the outputs relied upon by climate alarmists are extreme model simulations, not the kind of reliable intelligence upon which drastic policy measures should be taken.
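
For readers unfamiliar with the term, the carbon (climate) sensitivity $S$ is conventionally tied to concentration through a logarithmic forcing relationship (a standard textbook form, offered as an illustration rather than a claim of this post):

$$ \Delta T \approx S \cdot \log_2\!\left(\frac{C}{C_0}\right) $$

so that a doubling of CO₂ ($C = 2C_0$) produces warming of roughly $S$ degrees. Whether $S$ sits near 1.5C or 4.5C therefore makes an enormous difference to projected warming along any given emissions path, which is why the lower empirical estimates cited above matter so much.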

The constant anxiety issued from the Left on the issue of climate change, and not a little haranguing of the rest of us, is misplaced. The IPCC’s scenarios for the future paths of carbon concentration are outdated and seriously exaggerated, and they represent a breach of scientific protocol. Yet the scenarios are widely used as the basis of policy discussions at both the domestic and international levels. The climate models themselves embed questionable assumptions that create a bias toward calamitous outcomes.

Yet Drastic Action Is Urged

The UN’s 2021 climate conference, or COP26 (“Conference of the Parties …”) is taking place in Glasgow, Scotland this month. Like earlier international climate conferences, the hope is that dire forecasts will prompt broad agreement on goals and commitments, and that signatory countries will translate these into policy at the national level.

Things got off to a bad start when, before COP26 even began, the G20 nations failed to agree on a goal of “net-zero” carbon emissions by 2050. Another bad portent for the conference is that China and India, both big carbon emitters, will not attend, which must be tremendously disappointing to attendees. After all, COP26 has been billed by Prince Charles himself as “the last chance saloon, literally”, for saving the world from catastrophe. He said roughly the same thing before the Paris conference in 2014. And Joe Brandon … er, Biden, blurted some hyperbole of his own:

“Climate change is already ravaging the world. … It’s destroying people’s lives and livelihoods and doing it every single day. … It’s costing our nations trillions of dollars.”

All this is unadulterated hogwash. But it is the stuff upon which a crisis-hungry media feeds. This hucksterism is but one form of climate rent-seeking. Other forms are much more troubling: scary scenarios and model predictions serve the self-interest of regulators, grant-seeking researchers, interventionist politicians, and green investors who suckle at the public teat. It is a nightmare of scientism fed by the arrogance of self-interested social planners. The renewable energy technologies promoted by these investors, politicians, and planners are costly and land-intensive, providing only intermittent output (requiring backup fossil fuel capacity), and they have nasty environmental consequences of their own.

The precautionary principle is no excuse for the extreme policies advocated by alarmists. We already have economically viable “carbon efficient” and even zero-carbon energy alternatives, such as natural gas, modular nuclear power, and expanded opportunities for exploiting geothermal energy. This argues against premature deployment of wasteful renewables. The real crisis is the threat posed by the imposition of draconian green policies to our long-term prosperity, and especially to the world’s poor.

Portents of Harris-Biden Nation

22 Thursday Oct 2020

Posted by pnoetx in Politics

≈ Leave a comment

Tags

#MeToo, Anthony Weiner, Antifa, Barack Obama, Black Lives Matter, Court Packing, Critical Race Theory, Donald Trump, Green New Deal, Harvey Weinstein, Hunter Biden, Jeffrey Toobin, Joe Biden, Kamala Harris, Lockdowns, Marxism, Nancy Pelosi, Public Health, Scientism

Joe Biden is a weak figurehead, a one-time moderate faltering over a coalition of leftists. If you wonder why Nancy Pelosi floated legislation to establish a committee on “presidential capacity,” don’t think so much about her loathing for Donald Trump; think about poor Joe Biden. He might be shunted aside just as soon as the power grab isn’t too obvious. They know well how Barack Obama famously said, “Don’t underestimate Joe’s ability to f*ck things up.” But whether Joe Biden is in control of anything, think about who he stands with:

The Violent Left: Marxist Antifa and Marxist BLM; opposed to law and order; burning cities; spewing eliminationist rhetoric; hissing n*g**r at black cops;

Police Defunders: won’t acknowledge good policing is needed more than ever, especially in minority communities;

“Ministers of Truth”: social media platforms exerting control over what we say and what we see;

Re-Educators: democrats push for a “Truth and Reconciliation Commission” to address the “issue” of Trump supporters;

Critical Race Theorists: a Marxist front whereby every word and action is viewed in the context of racial bias and victimization; they want reparations; on your knees.

The Scientistic: who labor under the delusion that “science” should guide all administrative and political decisions. Or someone’s version of science. The very idea is antithetical to the scientific domain, which deals only with falsifiable hypotheses. Few matters of value can be addressed using the tools of science alone, and those tools cannot settle questions of ethics.

Fear Mongers: would rule by precaution; risks are always worth exaggerating to existential proportions;

Lockdown Tyrants: refuse to acknowledge the steep public health costs of lockdowns; stripping individual liberties indefinitely, including the right to contract, free practice of religion, and assembly;

Insurrectionists: who fabricated a Russian collusion hoax to subvert the 2016 election, and later to overthrow a sitting president;

Gun Confiscators: they will if we let them;

Abortionists: would use federal tax dollars to fund the murder of millions of babies late into pregnancy, primarily black babies;

Fluid-Genderists: insist that children should be encouraged to explore transgenderism;

Taxers: won’t stop with punitive taxes on the wealthy and employers; it’s just not easy to milk high earners in a way that’s sufficient to pay for the fiscal debauchery demanded by the Biden-Harris constituency. Joe says he will raise taxes by $3.4 trillion.

Spenders: $2 trillion of new federal education outlays, including universal pre-K and free community college; the Green New Deal (see below). After all, the democrats are the party that can’t tell the difference between a cut in spending and a reduction in spending growth. If you think Trump is a big spender, their plans are astonishing;

Green New Dealers: would spend trillions to restrict energy choices, transfer U.S. wealth overseas in the name of international carbon reduction, and reduce our standard of living;

Redistributionists: would tax job creators not simply for the benefit of supporting the needy, but for anyone regardless of need (see UBI); this extends to plans to bail out blue states and cities with insolvent public employee pension funds;

Interventionists: would regulate all phases of life, including straws, sugary drinks, and your fireplace; they will burden private initiative; create artificial, politically-favored winners skilled at manipulating regulatory rules for competitive reasons; and create losers who are typically too small to handle the burden;

Medical Socialists: will strip your private health insurance, dictate the care you may receive, fix prices, and regulate physicians and other providers. You’ll love the care abroad, if you can afford to get out when you’re sick.

Public School Monopolists: poorly performing, beholden to teachers’ unions, unresponsive to taxpayers and often parents; they would happily revoke school choice;

Federal Suburb Rezoners: demanding low-income housing in every community;

Court Packers: to destroy the independent judiciary;

Iran Apologists: give them cash on the tarmac, let them develop their “peaceful” nuclear program; alienate the rest of the Middle East;

Grifters: marketing their influence as public servants for private gain; never exclusive to one side of the aisle, but the Biden family has certainly traded on Joe to enrich themselves;

Smear Merchants: fabricated allegations against Brett Kavanaugh; impugned Amy Coney Barrett’s religious faith;

Perverts: Harvey Weinstein, Anthony Weiner, Jeffrey Toobin, Hunter Biden, and Bill Clinton, to name just a few; even Joe has his #MeToo accusers;

I could go on and on, but Harris-Biden voters should get a strong taste of their compatriots from the list above. It reflects the overriding prescriptive, bullying, and sometimes violent nature of the Left. They’d have you think all material goods can be free. Presto! They presume to have the knowledge and wisdom to plan the economy and your life better than you can, better than free markets and free people. What they’ll need is a lot of magic, or it won’t go well. You’ll get poverty and tears. I’m not sure Joe has the desire or the wherewithal to rein in his coalition of idiots.

Central Planning With AI Will Still Suck

23 Sunday Feb 2020

Posted by pnoetx in Artificial Intelligence, Central Planning, Free markets

≈ Leave a comment

Tags

Artificial Intelligence, central planning, Common Law, Data Science, Digital Socialism, Friedrich Hayek, Jesús Fernández-Villaverde, Machine Learning, Marginal Revolution, Property Rights, Robert Lucas, Roman Law, Scientism, The Invisible Hand, The Knowledge Problem, The Lucas Critique, Tyler Cowen

 

Artificial intelligence (AI) or machine learning (ML) will never make central economic planning a successful reality. Jesús Fernández-Villaverde of the University of Pennsylvania has written a strong disavowal of AI’s promise in central planning, and on the general difficulty of using ML to design social and economic policies. His paper, “Simple Rules for a Complex World with Artificial Intelligence“, was linked last week by Tyler Cowen at Marginal Revolution. Note that the author isn’t saying “digital socialism” won’t be attempted. Judging by the attention it’s getting, and given the widespread acceptance of the scientism of central planning, there is no question that future efforts to collectivize will involve “data science” to one degree or another. But Fernández-Villaverde, who is otherwise an expert and proponent of ML in certain applications, is simply saying it won’t work as a curative for the failings of central economic planning — that the “simple rules” of the market will always produce superior social outcomes.

The connection between central planning and socialism should be obvious. Central planning implies control over the use of resources, and therefore ownership by a central authority, whether or not certain rents are paid as a buy-off to the erstwhile owners of those resources. By “digital socialism”, Fernández-Villaverde means the use of ML to perform the complex tasks of central planning. The hope among its cheerleaders is that adaptive algorithms can discern the optimal allocation of resources within some “big data” representation of resource availability and demands, and that this is possible on an ongoing, dynamic basis.

Fernández-Villaverde makes the case against this fantasy on three fronts or barriers to the use of AI in policy applications: data requirements; the endogeneity of expectations and behavior; and the knowledge problem.

The Data Problem: ML requires large data sets to do anything. And impossibly large data sets are required for ML to perform the task of planning economic activity, even for a small portion of the economy. Today, those data sets do not exist except in certain lines of business. Can they exist more generally, capturing the details of all economic transactions? Can the data remain current? Only at great expense, and ML must be trained to recognize whether data should be discarded as it becomes stale over time due to shifting demographics, tastes, technologies, and other changes in the social and physical environment. 

Policy Change Often Makes the Past Irrelevant: Planning algorithms are subject to the so-called Lucas Critique, a well known principle in macroeconomics named after Nobel Prize winner Robert Lucas. The idea is that policy decisions based on observed behavior will change expectations, prompting responses that differ from the earlier observations under the former policy regime. A classic case involves the historical tradeoff between inflation and unemployment. Can this tradeoff be exploited by policy? That is, can unemployment be reduced by a policy that increases the rate of inflation (by printing money at a faster rate)? In this case, the Lucas Critique is that once agents expect a higher rate of inflation, they are unlikely to confuse higher prices with a more profitable business environment, so higher employment will not be sustained. If ML is used to “plan” certain outcomes desired by some authority, based on past relationships and transactions, the Lucas Critique implies that things are unlikely to go as planned.  
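
The inflation-unemployment example can be written with the standard expectations-augmented Phillips curve (a textbook formulation used here for illustration, not the author’s own):

$$ u = u_n - a\,(\pi - \pi^e), \qquad a > 0 $$

where $u$ is unemployment, $u_n$ its “natural” rate, $\pi$ actual inflation, and $\pi^e$ expected inflation. Printing money raises $\pi$ and lowers $u$ only while $\pi^e$ lags behind; once agents revise their expectations so that $\pi^e \approx \pi$, unemployment returns to $u_n$ at a permanently higher inflation rate. Any planner, human or ML, who estimates the old tradeoff from historical data and tries to exploit it will find the relationship has shifted beneath the new policy.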

The Knowledge Problem: Not only are impossibly large data sets required for economic planning with ML, as noted above. To achieve the success of markets in satisfying unlimited wants given scarce resources, the required information is impossible to collect or even to know. This is what Friedrich Hayek called the “knowledge problem”. Just imagine the difficulty of arranging a data feed on the shifting preferences of many different individuals across a huge number of products and services, and the way preference orderings will change across the range of possible prices. The data must have immediacy, not simply a historical record. Add to this the required information on shifting supplies and opportunity costs of resources needed to produce those things. And the detailed technological relationships between production inputs and outputs, including time requirements, and the dynamics of investment in future productive capacity. And don’t forget to consider the variety of risks agents face, their degree of risk aversion, and the ways in which risks can be mitigated or hedged. Many of these things are simply unknowable to a central authority. The information is hopelessly dispersed. The task of collecting even the knowable pieces is massive beyond comprehension.
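
A crude way to get a feel for the scale involved is simply to count the entries such a feed would need. The figures below are arbitrary, deliberately conservative assumptions chosen only to show how fast the combinatorics explode:

```python
# Rough count of the "data feed" a central planner would need to keep current.
# Every figure here is an assumed, conservative placeholder for illustration.

consumers    = 100_000_000   # adults in a large economy
products     = 1_000_000     # distinct goods and services (surely an undercount)
price_points = 10            # preferences evaluated at just ten candidate prices

preference_entries = consumers * products * price_points
print(f"{preference_entries:,} preference entries")   # 1,000,000,000,000,000

# ...and that ignores supplies, opportunity costs, production technologies,
# investment plans, and risk attitudes, all shifting continuously, and much of
# it tacit knowledge that cannot be written into any data set at all.
```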

The market system, however, is able to process all of this information in real time, the knowable and the unknowable, in ways that balance preferences with the true scarcity of resources. No one actor or authority need know it all. It is the invisible hand. Among many other things, it ensures the deployment of ML only where it makes economic sense. Here is Fernández-Villaverde:

“The only reliable method we have found to aggregate those preferences, abilities, and efforts is the market because it aligns, through the price system, incentives with information revelation. The method is not perfect, and the outcomes that come from it are often unsatisfactory. Nevertheless, like democracy, all the other alternatives, including ‘digital socialism,’ are worse.”

Later, he says:

“… markets work when we implement simple rules, such as first possession, voluntary exchange, and pacta sunt servanda. This result is not a surprise. We did not come up with these simple rules thanks to an enlightened legislator (or nowadays, a blue-ribbon committee of academics ‘with a plan’). … The simple rules were the product of an evolutionary process. Roman law, the Common law, and Lex mercatoria were bodies of norms that appeared over centuries thanks to the decisions of thousands and thousands of agents.” 

These simple rules represent good private governance. Beyond reputational enforcement, the rules require only trust in the system of property rights and a private or public judicial authority. Successfully replacing private arrangements in favor of a central plan, however intricately calculated via ML, will remain a pipe dream. At best, it would suspend many economic relationships in amber, foregoing the rational adjustments private agents would make as conditions change. And ultimately, the relationships and activities that planning would sanction would be shaped by political whim. It’s a monstrous thing to contemplate — both fruitless and authoritarian.

You’re Welcome: Charitable Gifts Prompt Statist Ire

14 Friday Dec 2018

Posted by pnoetx in Central Planning, Charity, Uncategorized

≈ 1 Comment

Tags

Amazon, American Institute for Economic Research, central planning, Charity, Chloe Anagnos, Day 1 Fund, Doug Bandow, Forced Charity, Gaby Del Valle, Homelessness, Jeff Bezos, Redistribution, Russ Roberts, Scientism, Seattle Employment Tax, War on Charity

Charitable acts are sometimes motivated by a desire to cultivate a favorable reputation, or even to project intelligence. Perhaps certain charitable acts are motivated by guilt of one kind or another. Tax deductions are nice, too. But sometimes a charitable gift is prompted by no more than a desire to help others less fortunate. It’s likely a combination of motives in many cases, but to gainsay the purity of anyone’s charitable motives is rather unseemly. Yet Gaby Del Valle does just that in Vox, casting a skeptical eye at Jeff Bezos’ efforts to help the homeless through his Day 1 Fund.

“Last week, Amazon founder and CEO Jeff Bezos announced that he and his wife, MacKenzie Bezos, were donating $97.5 million to 24 organizations that provide homeless services across the country. The donation is part of Bezos’s $2 billion ‘Day 1 Fund,’ a philanthropic endeavor … that, according to Bezos, focuses on establishing ‘a network of new, non-profit, tier-one preschools in low-income communities’ and funding existing nonprofits that provide homeless services.”

Del Valle says Bezos deserves little credit for his big gift for several reasons. First, Amazon very publicly opposed a recent initiative for a $275 per employee tax on large employers in Seattle. The proceeds would have been used to fund public programs for the homeless. This allegation suggests that Bezos feels guilty, or that the gift is a cynical attempt to buy-off critics. That might have an element of truth, but the tax was well worthy of opposition on economic grounds — almost as if it was designed to stunt employment and economic growth in the city.

Second, because Amazon has been an engine of growth for Seattle, Del Valle intimates that the company and other large employers are responsible for the city’s high cost of housing and therefore homelessness. Of course, growth in a region’s economy is likely to lead to higher housing prices if the supply of housing does not keep pace, but forsaking economic growth is not a solution. Furthermore, every large city in the country suffers from some degree of homelessness. And not all of those homeless individuals have been “displaced”, as Del Valle would have it. Some have relocated voluntarily without any guarantee or even desire for employment. As for the housing stock, government environmental regulations, zoning policies and rent control (in some markets) restrain expansion, leading to higher costs.

Finally, Del Valle implies that private efforts to help the homeless are somehow inferior to “leadership by elected officials”. Further, she seems to regard these charitable acts as threatening to “public” objectives and government control. At least she doesn’t disguise her authoritarian impulses. Del Valle also quotes a vague allegation that one of the charities beholden to Amazon is less than a paragon of charitable virtue. Well, I have heard similar allegations that government isn’t celebrated for rectitude in fulfilling its duties. Like all statists, Del Valle imagines that government technocrats possess the best vision of how to design aid programs. That attitude is an extension of the scientism and delusions of efficacy typical of central planners. Anyone with the slightest awareness of the government’s poor track record in low-income housing would approach such a question with trepidation. In contrast, private efforts often serve as laboratories in which to test innovative programs that can later be adopted on a broader scale.

While selfishness might motivate private acts of charity in some cases, only voluntary, private charity can ever qualify as real charity. Government benefits for the homeless are funded by taxes, which are compulsory. Such public programs might be justifiable as an extension of social insurance, but they are not charity in any pure sense; nor are their advocates engaged in promoting real charity, despite their conveniently moralistic positioning. And unlike private charity, government redistribution programs can be restrained only through a political process in which substantial payers are a distinct minority of the voting population.

Public aid and private charity have worked alongside each other for many years in the U.S. According to Russ Roberts, private giving to the poor began to be “crowded-out” during the Great Depression by a dramatic increase in public assistance programs. (Also see Doug Bandow’s “War On Charity“.) It’s certainly more difficult to make a case for gifts to the poor when donors are taxed by the government in order to redistribute income.

The statist war on private charity can take other forms. The regulatory apparatus can crowd out private efforts to extend a helping hand. Chloe Anagnos of the American Institute for Economic Research (AIER) writes of a charity in Kansas City that wanted to provide home-cooked soup to the homeless, but health officials intervened, pouring bleach into the soup. I am aware of similar but less drastic actions in St. Louis, where organizations attempting to hand out sandwiches to the poor were recently prohibited by health authorities.

Private charity has drawn criticism because its source has driven economic growth, its source has opposed policies that stunt economic growth, and because it might interfere with the remote possibility that government would do it better. But private charity plays a critical role in meeting the needs of the disadvantaged, whether as a substitute for public aid where it falls short, or as a supplement. It can also play a productive role in identifying the most effective designs for aid programs. Of course, there are corrupt organizations and individuals purporting to do charitable work, which argues for a degree of public supervision over private charities. But unfortunately, common sense is too often lost to overzealous enforcement. In general, the public sector should not stand in the way of private charities and charitable acts, but real generosity has little value to those who press for domination by the state.

Authoritarian Designs

31 Sunday Jan 2016

Posted by pnoetx in Progressivism, racism, Uncategorized

≈ Leave a comment

Tags

Bernie Sanders, Child Quotas, CRISPR, Davis-Bacon Act, Eugenics, Friedrich Hayek, John Stuart Mill, Jonah Goldberg, Kevin Drum, Minimum Wage, Mother Jones, Obamacare Effectiveness Research, Progressivism, racism, Scientism, Sterilization, Tyler Cowen

Why condemn today’s progressives for their movement’s early endorsement of eugenics? Kevin Drum at Mother Jones thinks this old association is now irrelevant. He furthermore believes that eugenics is not an important issue in the modern world. Drum’s remarks were prompted by Jonah Goldberg’s review of Illiberal Reformers, a book by Thomas Leonard on racism and eugenicism in the American economics profession in the late 19th century. Tyler Cowen begs to differ with Drum on both counts, but for reasons that might not have been obvious to Drum. Eugenics is not a bygone, and its association with progressivism is a reflection of the movement’s broader philosophy of individual subservience to the state and, I might add, the scientism that continues to run rampant among progressives.

Cowen cites John Stuart Mill, one of the great social thinkers of the 19th century, who was an advocate for individual liberty and a harsh critic of eugenics. Here is a great paragraph from Cowen:

“The claim is not that current Progressives are evil or racist, but rather they still don’t have nearly enough Mill in their thought, and not nearly enough emphasis on individual liberty. Their continuing choice of label seems to indicate they are not much bothered by that, or maybe not even fully aware of that. They probably admire Mill’s more practical reform progressivism quite strongly, or would if they gave it more thought, but they don’t seem to relate to the broader philosophy of individual liberty as it surfaced in the philosophy of Mill and others. That’s a big, big drawback and the longer history of Progressivism and eugenics is perhaps the simplest and most vivid way to illuminate the point. This is one reason why the commitment of the current Left to free speech just isn’t very strong.“

Eugenics is not confined to the distant past, as Cowen notes, citing more recent “progressive” sterilization programs in Sweden and Canada, as well as the potential use of DNA technologies like CRISPR in “designing” offspring. That’s eugenics. So is the child quota system practiced in China, sex-selective abortion, and the easy acceptance of aborting fetuses with congenital disorders. Arguably, Obamacare “effectiveness research” guidelines cut close to eugenicism by proscribing certain treatments to individuals based upon insufficient “average benefit”, which depends upon age, disability, and stage of illness. Obamacare authorizes that the guidelines may ultimately depend on gender, race and ethnicity. All of these examples illustrate the potential for eugenics to be practiced on a broader scale and in ways that could trample individual rights.

Jonah Goldberg also responded to Drum in “On Eugenics and White Privilege“. (You have to scroll way down at the link to find the section with that title.) Goldberg’s most interesting points relate to the racism inherent in the minimum wage and the Davis-Bacon Act, two sacred cows of progressivism with the same original intent as eugenics: to weed out “undesirables”, either from the population or from competing in labor markets. It speaks volumes that today’s progressives deny the ugly economic effects of these policies on low-skilled workers, yet their forebears were counting on those effects.

Scientism is a term invoked by Friedrich Hayek to describe the progressive fallacy that science and planning can be used by the state to optimize the course of human affairs. However, the state can never command all the information necessary to do so, particularly in light of the dynamism of information relating to scarcity and preferences; government has trouble enough carrying out plans that merely match the static preferences of certain authorities. Historically, such attempts at planning have created multiple layers of tragedy, as individual freedoms and material well-being were eroded. Someone should tell Bernie Sanders!

Eugenics fit nicely into the early progressive view, flattering its theorists with the notion that the human race could be made… well, more like them! Fortunately, eugenics earned its deservedly bad name, but it continues to exist in somewhat more subtle forms today, and it could take more horrific forms in the future.

Two earlier posts on Sacred Cow Chips dealt at least in part with eugenics: “Child Quotas: Family as a Grant of Privilege“, and “Would Heterosexuals Select For Gay Genes?“.

 


Horizons Lost To Coercive Intervention

27 Wednesday Jan 2016

Posted by pnoetx in Human Welfare, Price Controls, Regulation

≈ Leave a comment

Tags

Allocation of Resources, Don Boudreaux, Foregone Alternatives, Frederic Bastiat, Luddites, Minimum Wage, Opportunity Costs, Price Ceilings, Price Controls, Price floors, Rent Control, Scientism, Unintended Consequences, What is Not Seen

Every action has a cost. When you’re on the hook, major decisions are obviously worth pondering. But major societal decisions are often made by agents who are not on the hook, with little if any accountability for long-term consequences. They have every incentive to discount potential downside effects, especially in the distant future. Following Frederic Bastiat, Don Boudreaux writes of three levels of “What Is Not Seen” as a consequence of human decisions, which I summarize here:

  1. Immediate foregone alternatives: Possession, use and enjoyment of X is not seen if you buy Y.
  2. Resources not directed to foregone alternatives: The reduction in X inventory is not seen, compensating production of X is not seen, and extra worker hours, capital use and flow of raw materials needed for X production are not seen.
  3. The future implied by foregone alternatives: Future impacts can take many forms. X might have been a safer or healthier alternative, but those benefits are unseen. X might have been lower quality, so the potential frustration and repairs are unseen. X might have been less expensive, but the future benefits of the money saved are unseen. All of these “unseens” have implications for the future world experienced by the decision-maker and others.

These effects take on much more significance in multiples, but (2) and (3) constitute extended unseen implications for society at large. In multiples, the lost (unseen) X production and X labor-hours, capital and raw materials are more obvious to the losers in the X industry than the winners in the Y industry, but they matter. In the future, the vibrant X industry that might have been will not be seen; the resources diverted to meet Y demand won’t be seen at new or even old X factories. X might well vanish, leaving only nontransformable detritus as a token of its existence.

Changes in private preferences or in production technologies create waves in the course of the “seen” reality and the “unseen” world foregone. Those differences are caused by voluntary, private choice, so gains are expected to outweigh losses relative to the “road not traveled”. That’s not a given, however, when decisions are imposed by external authorities with incentives unaligned with those in their thrall. For that reason, awareness of the unseen is of great importance in policy analysis, which is really Boudreaux’s point. Here is an extreme example he offers in addressing the far-reaching implications of government intrusions:

“Suppose that Uncle Sam in the early 20th century had, with a hypothetical Ludd Act, effectively prohibited the electrification of American farms, businesses, and homes. That such a policy would have had a large not-seen element is evident even to fans of Bernie Sanders. But the details of this not-seen element would have been impossible today even to guess at with any reliability. Attempting to quantify it econometrically would be an exercise in utter futility. No one in a 2015 America that had never been electrified could guess with any sense what the Ludd Act had cost Americans (and non-Americans as well). The not-seen would, in such a case, loom so large and be so disconnected to any known reality that it would be completely mysterious.“

Price regulation provides more familiar examples. Rent controls intended to “protect” the public from landlords have enormous “unintended” consequences. Like any price regulation, rent controls stifle exchange, reducing the supply and quality of housing. Renters are given an incentive to remain in their units, and property owners have little incentive to maintain or upgrade their properties. Deterioration is inevitable, and ultimately displacement of renters. The unseen, lost world would have included more housing, better housing, more stable neighborhoods and probably less crime.

A price floor covered by Boudreaux is the minimum wage. The fully predictable but unintended consequences include immediate losses in some combination of jobs, hours, benefits, and working conditions by the least-skilled class of workers. Higher paid workers feel the impact too, as they are asked to perform more (and less complex) tasks or are victimized by more widespread substitution of capital for labor. Consumers also feel some of the pain in higher prices. The net effect is a reduction in mutually beneficial trade that continues and may compound with time:

“As the time span over which obstructions to certain economic exchanges lengthens, the exchanges that would have, but didn’t, take place accumulate. The businesses that would have been created absent a minimum wage – but which, because of the minimum wage, are never created – grow in number and variety. The instances of on-the-job worker training that would have occurred – but, because of the minimum wage, didn’t occur – stack up increasingly over time.“
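
The minimum wage is a price floor, the mirror image of the rent-control ceiling described above. A minimal numerical sketch with hypothetical linear labor supply and demand (the numbers are invented purely for illustration, not estimates of any real labor market):

```python
# Illustrative sketch of a binding wage floor in a low-skill labor market.
# All parameters are hypothetical; only the direction of the effects matters.

def labor_demand(w):
    """Jobs offered by employers, falling in the wage."""
    return max(0.0, 120 - 4 * w)

def labor_supply(w):
    """Workers seeking jobs, rising in the wage."""
    return max(0.0, -30 + 6 * w)

# Market clearing: 120 - 4w = -30 + 6w  ->  w* = 15, employment = 60
w_star = 15
print("market wage:", w_star, "employment:", labor_demand(w_star))

floor = 18                        # minimum wage set above w*
employed = labor_demand(floor)    # 48: fewer jobs offered
seeking = labor_supply(floor)     # 78: more workers chasing those jobs
print("employed under the floor:", employed)
print("workers who can't find jobs at that wage:", seeking - employed)
```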

Regulation and taxation of all forms have such destructive consequences, but policy makers seldom place a heavy weight on the unobserved counterfactual. Boudreaux emphasizes the futility of quantifying the “unseen” effects of these policies:

“… those who insist that only that which can be measured and quantified with numerical data is real must deny, as a matter of their crabbed and blinding scientism, that such long-term effects … are not only not-seen but also, because they are not-seen, not real.“

The trade and welfare losses of coercive interventions of all types are not hypothetical. They are as real as the losses caused by destruction of property by vandals. Never again can the owners enjoy the property as they once had. Future pleasures are lost and cannot be observed or measured objectively. Even worse, when government disrupts economic activity, the cumulative losses condemn the public to a backward world that they will find difficult to recognize as such.

 

A Cooked-Up Climate Consensus

14 Tuesday Jul 2015

Posted by pnoetx in Global Warming

≈ 3 Comments

Tags

97% Consensus, AGW, Anthropomorphic Global Warming, Climate Change, Climate change consensus, Climate fraud, Ian Plimer, John Cook, Matt Ridley, Peer Review Process, Richard Tol, Scientism, University of Queensland

Consensus: the world is flat; the science is settled. Consensus: the earth is at the center of the universe; the science is settled. Consensus: bloodletting can cure diseases; the science is settled. Did these ideas truly represent scientific consensus? People probably thought so at the time, but it’s more likely that they derived from long- and widely-held assumptions that had never been tested adequately via scientific methods. It might have been difficult, if not impossible, to test those propositions using the methods available at the time. There are certainly other examples of “settled science” that were later revised, such as certain aspects of Newtonian physics.

The so-called “consensus” on climate change is similar to the first few “scientistic” assertions above, except that it’s a much less honest mistake. The most prominent claim about it is that 97% of climate scientists agree that humans have contributed to global warming. That is incorrect in several ways. Its genesis is a 2013 paper by John Cook of the University of Queensland. Richard Tol of the University of Sussex examines the facts surrounding the Cook paper in “Global warming consensus claim does not stand up“. The claim itself is a misrepresentation of Cook’s findings, according to Tol:

“The 97% refers to the number of papers, rather than the number of scientists. The alleged consensus is about any human role in climate change, rather than a dominant role….“

It is well known that the peer review process in the climate research community was fundamentally corrupt during the period covered by Cook’s examination of the literature. Papers submitted to academic journals by climate “dissenters” were often shut out, which would have biased Cook’s findings even if his review had been conducted honestly. Tol goes on to note the distortions introduced by Cook’s research, including a non-representative sample of papers:

“The sample was padded with irrelevant papers. An article about TV coverage on global warming was taken as evidence for global warming. In fact, about three-quarters of the papers counted as endorsements had nothing to say about the subject matter.“

It gets even worse:

“Cook enlisted a small group of environmental activists to rate the claims made by the selected papers. Cook claims that the ratings were done independently, but the raters freely discussed their work. There are systematic differences between the raters. Reading the same abstracts, the raters reached remarkably different conclusions – and some raters all too often erred in the same direction. Cook’s hand-picked raters disagreed what a paper was about 33% of the time. In 63% of cases, they disagreed about the message of a paper with the authors of that paper.“

On top of all that, Cook was uncooperative when asked to make his data available to other researchers. Apparently a hacker obtained the data, which revealed a highly questionable data collection process (and that Cook had lied regarding the existence of time stamps on the surveys):

“After collecting data for 8 weeks, there were 4 weeks of data analysis, followed by 3 more weeks of data collection. The same people collected and analysed the data. After more analysis, the paper classification scheme was changed and yet more data collected.“

In short, the Cook research upon which the 97% claim is based is trash. There are a number of points upon which climate researchers can largely agree in principle, including the fact that greenhouse gases would warm the planet, but only if ceteris paribus is invoked. There are many feedback effects and confounding influences that change the relationship, and the actual time span of data that can be brought to bear on the issue is strikingly short as a basis for bold conclusions. Unfortunately, the research environment is so politicized that even the data itself is subject to manipulation. Astonishingly, many assertions about the actual climate are, in fact, based on model output, not actual data!

There is strong disagreement at the highest levels of the scientific community regarding the balance of the evidence on climate change and whether it justifies radical policy change. Matt Ridley examines this issue in “The Climate Wars’ Damage To Science“:

“Today’s climate science, as Ian Plimer points out in his chapter in The Facts, is based on a ‘pre-ordained conclusion, huge bodies of evidence are ignored and analytical procedures are treated as evidence’. Funds are not available to investigate alternative theories. Those who express even the mildest doubts about dangerous climate change are ostracised, accused of being in the pay of fossil-fuel interests or starved of funds; those who take money from green pressure groups and make wildly exaggerated statements are showered with rewards and treated by the media as neutral.“

Ridley goes on to recount the litany of scandals that have erupted within the climate establishment over the past few years. It is well worth reading, but ultimately these developments can’t help but damage science, its reputation with the public, and its usefulness to mankind.

What a Joy To Be a Social Scientist With ESSP

09 Wednesday Jul 2014

Posted by pnoetx in Uncategorized

≈ Leave a comment

Tags

Arnold Kling, central planning, Scientism, Social Science

The world of social phenomena is so complex that we should be guarded in accepting appeals to scientism. As Arnold Kling points out, social scientism is insidious because it may appear to comport with “common sense,” yet this frequently involves a fallacy of division. Kling believes we should do our best to exercise “ESSP,” or Epistemological Skepticism about Social Phenomena. The egos of central planners are fed by social scientism of the type described by Kling, but their promises regularly fail to pan out, leading to a kind of societal senescence. But if we all rev up our ESSP, and keep our meddling hands off, we’re likely to enjoy a more creative and prosperous society. Let freedom ring!
