Sacred Cow Chips

Monthly Archives: April 2017

What Part of “Free Speech” Did You Not Understand?

27 Thursday Apr 2017

Posted by Nuetzel in Censorship, Free Speech

≈ Leave a comment

Tags

Antifa, Censorship, Eugene Volokh, Fighting Words, First Amendment, Free Speech, Harry A. Blackmun, Hate Speech, Imminent Lawless Action, John Daniel Davidson, New York University, prior restraint doctrine, Reason.com, Robby Soave, The Federalist, Ulrich Baer

The left has adopted an absurdly expansive definition of “hate speech”, and they’d like you to believe that “hate speech” is unconstitutional. Their objective is to establish a platform from which they can ostracize and ultimately censor political opponents on a variety of policy issues, mixed with the pretense of a moral high ground. The constitutional claim is legal nonsense, of course. To be fair, the moral claim may depend on the issue.

John Daniel Davidson writes in The Federalist of the distinction between protected and unprotected speech in constitutional law. The primary exception to protected speech has to do with the use of “fighting words”. Davidson describes one Supreme Court interpretation of fighting words as “a face-to-face insult directed at a specific person for the purpose of provoking a fight.” Threats would obviously fall into the same category, but only to the extent that they imply “imminent lawless action”, according to a major precedent. As such, there is a distinction between fighting words and speech that is critical, discriminatory, or even hateful, all of which is protected.

Hate speech, on the other hand, has no accepted legal definition. In law, it has not been specifically linked to speech offensive to protected groups under employment, fair housing, hate crime or any other legislation. If we are to accept the parlance of the left, it seems to cover almost anything over which one might take offense. However, unless it qualifies as fighting words, it is protected speech.

The amorphous character of hate speech, as a concept, makes it an ideal vehicle for censoring political opponents, and that makes it extremely dangerous to the workings of a free society. Any issue of public concern has more than one side, and any policy solution will usually create winners and losers. Sometimes the alleged winners and losers are merely ostensible winners and losers, as dynamic policy effects or “unexpected consequences” often change the outcomes. Advocacy for one solution or another seldom qualifies as hate toward those presumed to be losers by one side in a debate, let alone a threat of violence. Yet we often hear that harm is done by the mere expression of opinion. Here is Davidson:

“By hate speech, they mean ideas and opinions that run afoul of progressive pieties. Do you believe abortion is the taking of human life? That’s hate speech. Think transgenderism is a form of mental illness? Hate speech. Concerned about illegal immigration? Believe in the right to bear arms? Support President Donald Trump? All hate speech.”

Do you support the minimum wage? Do you oppose national reparation payments to African Americans? Do you support health care reform? Welfare reform? Rollbacks in certain environmental regulations? Smaller government? You just might be a hater, according to this way of thinking!

The following statement appears in a recent proposal on free speech. The proposal was recommended as policy by an ad hoc committee created by the administration of a state university:

“… Nor does freedom of expression create a privilege to engage in discrimination involving unwelcome verbal, written, or physical conduct directed at a particular individual or group of individuals on the basis of actual or perceived status, or affiliation within a protected status, and so severe or pervasive that it creates an intimidating or hostile environment that interferes with an individual’s employment, education, academic environment, or participation in the University’s programs or activities.”

This is an obvious departure from the constitutional meaning of free expression or any legal precedent.

And here is Ulrich Baer, who is New York University’s vice provost for faculty, arts, humanities, and diversity (and professor of comparative literature), in an opinion piece this week in the New York Times:

“The recent student demonstrations [against certain visiting speakers] should be understood as an attempt to ensure the conditions of free speech for a greater group of people, rather than censorship. … Universities invite speakers not chiefly to present otherwise unavailable discoveries, but to present to the public views they have presented elsewhere. When those views invalidate the humanity of some people, they restrict speech as a public good.  …

The idea of freedom of speech does not mean a blanket permission to say anything anybody thinks. It means balancing the inherent value of a given view with the obligation to ensure that other members of a given community can participate in discourse as fully recognized members of that community.”

How’s that for logical contortion? Silencing speakers is an effort to protect free speech! As noted by Robby Soave on Reason.com, “... free speech is not a public good. It is an individual right.” This cannot be compromised by the left’s endlessly flexible conceptualization of “hate speech”, which can mean almost any opinion with which they disagree. Likewise, to “invalidate the humanity of some people” is a dangerously subjective standard. Mr. Baer is incorrect in his assertion that speakers must balance the “inherent” value of their views with an obligation to be “inclusive”. The only obligation is not to threaten or incite “imminent lawless action”. Otherwise, freedom of speech is a natural and constitutionally unfettered right to express oneself. Nothing could be more empowering!

Note that the constitution specifically prohibits the government from interfering with free speech. That includes any public institution such as state universities. Private parties, however, are free to restrict speech on their own property or platform. For example, a private college can legally restrict speech on its property and within its facilities. The owner of a social media platform can legally restrict the speech used there as well.

Howard Dean, a prominent if somewhat hapless member of the Democrat establishment, recently tweeted this bit of misinformation: “Hate speech is not protected by the first amendment.” To this, Dean later added some mischaracterizations of Supreme Court decisions, prompting legal scholar Eugene Volokh to explain the facts. Volokh cites a number of decisions upholding a liberal view of free speech rights (and I do not use the word liberal lightly). Volokh also cites the “prior restraint doctrine”:

“The government generally may not exclude speakers — even in government-owned ‘limited public forums’ — because of a concern that the speakers might violate the rules if they spoke.”

If a speaker violates the law by engaging in threats or inciting violence, it is up to law enforcement to step in, ex post, just as they should when antifa protestors show their fascist colors through violent efforts to silence speakers. Volokh quotes from an opinion written by Supreme Court Justice Harry A. Blackmun:

“… a free society prefers to punish the few who abuse rights of speech after they break the law than to throttle them and all others beforehand. It is always difficult to know in advance what an individual will say, and the line between legitimate and illegitimate speech is often so finely drawn that the risks of freewheeling censorship are formidable.”

Imprecision and Unsettled Science

21 Friday Apr 2017

Posted by Nuetzel in Global Warming, Propaganda

≈ 1 Comment

Tags

Abatement Cost, Carbon Abatement, Carbon Forcings, Carbon Limits, Charles Hooper, Climate models, Cloud Formation, Confidence Interval, David Henderson, Earth Day, Measurement Error, Natural Climate Variation, Solar Forcings, Statistical Precision, Surface Temperatures, Temperature Aggregation, William Nordhaus

Last week I mentioned some of the inherent upward biases in the earth’s more recent surface temperature record. Measuring a “global” air temperature at the surface is an enormously complex task, requiring the aggregation of measurements taken using different methods and instruments (land stations, buoys, water buckets, ship water intakes, different kinds of thermometers) at points that are unevenly distributed across latitudes, longitudes, altitudes, and environments (sea, forest, mountain, and urban). Those measurements must be extrapolated to surrounding areas that are usually large and environmentally diverse. The task is made all the more difficult by the changing representation of measurements taken at these points, and changes in the environments at those points over time (e.g., urbanization). The spatial distribution of reports may change systematically and unsystematically with the time of day (especially onboard ships at sea).

The precision with which anything can be measured depends on the instrument used. Beyond that, there is often natural variation in the thing being measured. Some thermometers are better than others, and the quality of these instruments has varied tremendously over the roughly 165-year history of recorded land temperatures. The temperature itself at any location is subject to variation as the air shifts, but temperature readings are like snapshots taken at points in time, and may not be representative of areas nearby. In fact, the number of land weather stations used in constructing global temperatures has declined drastically since the 1970s, which implies an increasing error in approximating temperatures within each expanding area of coverage.

The point is that a statistical range of variation exists around each temperature measurement, and additional error is introduced by the vagaries of the aggregation process. David Henderson and Charles Hooper discuss the handling of temperature measurement errors in aggregation and in discussions of climate change. The upward trend in the “global” surface temperature between 1856 and 2004 was about 0.8° C, but a 95% confidence interval around that change is ±0.98° C. (If anything, that interval is probably too narrow, given the sketchiness of the early records.) In other words, from a statistical perspective, one cannot reject the hypothesis that the global surface temperature was unchanged over the full period.
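The statistical point can be sketched numerically. The 0.8° C trend and ±0.98° C interval are the figures quoted above; backing out an implied standard error from the 95% half-width is my own illustration:

```python
# Back out the implied standard error from the quoted numbers
# (trend = 0.8 C over 1856-2004; 95% CI half-width = 0.98 C),
# then test the hypothesis of zero trend at the 5% level.
trend = 0.8          # estimated warming, deg C
half_width = 0.98    # 95% confidence half-width, deg C

se = half_width / 1.96                        # implied standard error
ci = (trend - half_width, trend + half_width)
z = trend / se                                # statistic for H0: no change

print(f"95% CI: ({ci[0]:+.2f}, {ci[1]:+.2f}) deg C")  # interval spans zero
print(f"z = {z:.2f} < 1.96: zero trend cannot be rejected")
```

Since the interval spans zero, the measured warming is statistically indistinguishable from no change at the 5% level, which is exactly the Henderson and Hooper point.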

Henderson and Hooper make some other salient points related to the negligible energy impulse from carbon forcings relative to the massive impact of variations in solar energy and the uncertainty around the behavior of cloud formation. It’s little wonder that climate models relying on a carbon-forcing impact have erred so widely and consistently.

In addition to reinforcing the difficulty of measuring surface temperatures and modeling the climate, the implication of the Henderson and Hooper article is that policy should not be guided by measurements and models subject to so much uncertainty and such minor impulses or “signals”. The sheer cost of abating carbon emissions is huge, though some alternative means of doing so are better than others. Costs increase with the degree of abatement (or as low-carbon alternatives displace existing capacity), and I suspect that the incremental benefit decreases. Strict limits on carbon emissions reduce economic output. On a broad scale, that would impose a sacrifice of economic development and incomes in the non-industrialized world, not to mention low-income minorities in the developed world.

One well-known estimate by William Nordhaus involved a 90% reduction in world carbon emissions by 2050. He calculated a total long-run cost of between $17 trillion and $22 trillion; annually, the cost was about 3.5% of world GDP. The climate model Nordhaus used suggested that the reduction in global temperatures would be between 1.3º and 1.6º C, but in view of the foregoing, that range is highly speculative and likely an extreme exaggeration. And note the narrow width of that “confidence interval”. It is not a confidence interval in the usual sense; it is a “stab” at the uncertainty in a forecast of something many years hence. Nordhaus could not possibly have considered all sources of uncertainty in arriving at that range of temperature change, least of all the errors in measuring global temperature to begin with.
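A back-of-the-envelope ratio makes the cost point concrete. The cost and temperature figures are Nordhaus’s as quoted above; the per-degree calculation is my own illustration, not his:

```python
# Implied cost per degree of avoided warming, using the quoted ranges.
cost_low, cost_high = 17e12, 22e12   # total long-run cost, dollars
dt_low, dt_high = 1.3, 1.6           # projected temperature reduction, deg C

best_case = cost_low / dt_high       # cheapest case: low cost, high payoff
worst_case = cost_high / dt_low      # costliest case: high cost, low payoff
print(f"${best_case/1e12:.1f}T to ${worst_case/1e12:.1f}T per deg C avoided")
```

That works out to roughly $10.6 to $16.9 trillion per degree, and even that assumes the model’s temperature response is not exaggerated.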

Climate change activists would do well to spend their Earth Day educating themselves about the facts of surface temperature measurement. Their usual prescription is to extract resources and coercively deny future economic gains in exchange for steps that might or might not solve a problem they insist is severe. The realities are that the “global temperature” is itself subject to great uncertainty, and its long-term trend over the historical record cannot be distinguished statistically from zero. In terms of impacting the climate, natural forces are much more powerful than carbon forcings. And the models on which activists depend are so rudimentary, and so error prone and biased historically, that taking your money to solve the problem implied by their forecasts is utter foolishness.

Better Bids and No Bumpkins

18 Tuesday Apr 2017

Posted by Nuetzel in Air Travel, Property Rights, Secondary Markets

≈ Leave a comment

Tags

Bumping, Denied Boardings, Department of Transportation, Frequent Flier Miles, Involuntary Bumps, John Cochrane, Julian Simon, Secondary markets, TSA, United Airlines, Voluntary Bumps

United Airlines’ mistreatment of a passenger last week in Chicago had nothing to do with overbooking, but commentary on the issue of overbooking is suddenly all the rage. The fiasco in Chicago began when four United employees arrived at the gate after a flight to Louisville had boarded. The flight was not overbooked, just full, but the employees needed to get to Louisville. United decided to “bump” four passengers to clear seats for the employees. It used an algorithm to select the four passengers based on factors like lowest fare paid and latest purchase. The four passengers were offered vouchers for a later flight and a free hotel night in Chicago. Three of the four agreed, but the fourth refused to budge. United enlisted the help of Chicago airport security officers, who dragged the unwilling victim off the flight, bloodying him in the process. It was a terrible day for United’s public relations, and the airline will probably end up paying an expensive out-of-court settlement to the mistreated passenger.

Putting the unfortunate Chicago affair aside, is overbooking a big problem? Airlines always have cancellations, so they overbook in order to keep the seats filled. That means higher revenue and reduced costs on a per passenger basis. Passengers are rarely bumped from flights involuntarily: about 0.005% in the fourth quarter of 2016, according to the U.S. Department of Transportation. “Voluntarily denied boardings” are much higher: about 0.06%. Both of these figures seem remarkably low as “error rates”, in a manner of speaking.

Issues like the one in Chicago do not arise under normal circumstances because “bumps” are usually resolved before boarding takes place, albeit not always to everyone’s satisfaction. Still, if airlines were permitted (and willing) to bid sufficiently high rates of compensation to bumped ticket-holders, there would be no controversy at all. All denied boardings would be voluntary. There are a few other complexities surrounding the rules for compensation, which depend on estimates of the extra time necessary for a bumped traveler to reach their final destination. If less than an extra hour, for example, then no compensation is required. In other circumstances, the maximum compensation level allowed by the government is $1,300. These limits can create an impasse if a passenger is unwilling to accept the offer (or non-offer when only an hour is at stake). The only way out for the airline, in that case, is an outright taking of the passenger’s boarding rights. Of course, this possibility is undoubtedly in the airline’s “fine print” at the time of the original purchase.
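The compensation rule described above can be caricatured in a few lines. Only the under-an-hour exemption and the $1,300 cap come from the discussion above; the fare multiplier is a placeholder, not the actual DOT schedule:

```python
# Toy version of the denied-boarding compensation rule: nothing if the
# extra delay is under an hour, otherwise a fare-based amount up to a cap.
def bump_compensation(delay_hours, fare, multiplier=2.0, cap=1300.0):
    if delay_hours < 1.0:
        return 0.0                      # short delays: no compensation owed
    return min(fare * multiplier, cap)  # capped regardless of the fare

print(bump_compensation(0.5, 500))   # 0.0 -- under an hour, nothing owed
print(bump_compensation(3.0, 800))   # 1300.0 -- the cap binds
```

The cap is what creates the impasse: once the offer reaches $1,300 and the passenger still declines, the airline’s only remaining move is an outright taking of the boarding rights.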

No cap on a bumped traveler’s compensation was anticipated when economist Julian Simon first proposed such a scheme in 1968:

“The solution is simple. All that need happen when there is overbooking is that an airline agent distributes among the ticket-holders an envelope and a bid form, instructing each person to write down the lowest sum of money he is willing to accept in return for waiting for the next flight. The lowest bidder is paid in cash and given a ticket for the next flight. All other passengers board the plane and complete the flight to their destination.”

Today’s system is a simplified version of Simon’s suggestion, and somewhat bastardized, given the federal caps on compensation. If the caps were eliminated without other offsetting rule changes, would the airlines raise their bids sufficiently to eliminate most involuntary bumps? There would certainly be pressure to do so. Of course, the airlines already get to keep the fares paid on no-shows if they are non-refundable tickets.
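Simon’s proposal is a simple reverse auction, and it can be sketched in a few lines. The passenger names, bid amounts, and the pay-as-bid rule for multiple seats are my illustration of the idea, not details from Simon:

```python
# Each ticket-holder submits the lowest payment they would accept to
# take the next flight; the lowest bidders are bumped and paid their bids.
def run_bump_auction(bids, seats_needed):
    """bids: dict mapping passenger -> lowest acceptable compensation."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1])  # cheapest first
    winners = ranked[:seats_needed]
    return [name for name, _ in winners], sum(amt for _, amt in winners)

bids = {"Ann": 400, "Bob": 150, "Cho": 900, "Dee": 300}
bumped, payout = run_bump_auction(bids, seats_needed=2)
print(bumped, payout)   # ['Bob', 'Dee'] 450 -- both bumps are voluntary
```

With no cap on what the airline may bid, someone’s price is always met, so no one is ever removed involuntarily.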

John Cochrane makes another suggestion: limit ticket sales to the number of seats on the plane and allow a secondary market in tickets to exist, just as resale markets exist for concert and sports tickets. Bumps would be a thing of the past, or at least they would all be voluntary and arranged for mutual gain by the buyers and sellers. Some say that peculiarities of the airline industry argue that the airlines themselves would have to manage any resale market in their own tickets (see the comments on Cochrane’s post). That includes security issues, tickets with special accommodations for disabilities, meals, or children, handling transfers of frequent flier miles along with the tickets, and senior discounts.

Conceivably, trades on such a market could take place right up to the moment before the doors are closed on the plane. Buyers would still have to go through security, however, and a valid boarding pass is needed to get through security. That might limit the ability of the market to clear in the final moments before departure: potential buyers would simply not be on hand. Only those already through security, on layovers, or attempting to rebook on the concourse could participate without changes in the security rules. Perhaps this gap could be minimized if last-minute buyers qualified for TSA pre-check. Also, with the airline’s cooperation, electronic boarding passes would have to be made changeable so that the new passenger’s name would match his or her identification. Clearly, the airlines would have to be active participants in arranging these trades, but a third-party platform for conducting trades is not out of the question.

Could other concerns about secondary trading be resolved on a third-party platform? Probably, but again, solutions would require participation by the airlines. Trading miles along with the ticket could be made optional (after all, the miles would have a market value), but the trade of miles would have to be recorded by the airline. The tickets themselves could trade just as they were sold originally by the airline, whether their special accommodations are still necessary or not. The transfer of a discounted ticket might obligate the buyer to pay the airline a sum equal to the discount unless the buyer qualified under the same discount program. All of these problems could be resolved.

Would the airlines want a secondary market in their tickets? Probably not. If there are gains to be made on resale, they would rather capture as much of those gains as they possibly can. The federal caps on compensation to bumped fliers give the airlines a break in that regard, and they should be eliminated in the interests of consumer welfare. Let’s face it: the airlines know that a seat on an overbooked flight is a scarce resource, and the owner (the original ticket buyer) should be paid fair market value if the airline wants to take the ticket back for someone else. Without the caps, airlines would have to increase their bids until the market clears, which means fliers would never be bumped involuntarily. A secondary market in tickets, however, would obviate the practice of overbooking and allow fliers to capture the gain in exchange for surrendering their tickets. Once purchased, a ticket belongs to its holder.

Playing Pretend Science Over Cocktails

13 Thursday Apr 2017

Posted by Nuetzel in Global Warming

≈ 2 Comments

Tags

97% Consensus, AGW, Carbon Forcing Models, Climate Feedbacks, CO2 and Greening, East Anglia University, Hurricane Frequency, Judith Curry, Matt Ridley, NOAA, Paleoclimate, Peer Review Corruption, Ross McKitrick, Roy Spencer, Sea Levels, Steve McIntyre, Temperature Proxies, Urbanization Bias

It’s a great irony that our educated and affluent classes have been largely zombified on the subject of climate change. Their brainwashing by the mainstream media has been so effective that these individuals are unwilling to consider more nuanced discussions of the consequences of higher atmospheric carbon concentrations, or any scientific evidence to suggest contrary views. I recently attended a party at which I witnessed several exchanges on the topic. It was apparent that these individuals are conditioned to accept a set of premises while lacking real familiarity with supporting evidence. Except in one brief instance, I avoided engaging on the topic, despite my bemusement. After all, I was there to party, and I did!

The zombie alarmists express their views within a self-reinforcing echo chamber, reacting to each others’ virtue signals with knowing sarcasm. They also seem eager to avoid any “denialist” stigma associated with a contrary view, so there is a sinister undercurrent to the whole dynamic. These individuals are incapable of citing real sources and evidence; they cite anecdotes or general “news-say” at best. They confuse local weather with climate change. Most of them haven’t the faintest idea how to find real research support for their position, even with powerful search engines at their disposal. Of course, the search engines themselves are programmed to prioritize the very media outlets that profit from climate scare-mongering. Catastrophe sells! Those media outlets, in turn, are eager to quote the views of researchers in government who profit from alarmism in the form of expanding programs and regulatory authority, as well as researchers outside of government who profit from government grant-making authority.

The Con in the “Consensus”

Climate alarmists take assurance in their position by repeating the false claim that 97% of climate scientists believe that human activity is the primary cause of warming global temperatures. The basis for this strong assertion comes from an academic paper that reviewed other papers, the selection of which was subject to bias. The 97% figure was not a share of “scientists”. It was the share of the selected papers stating agreement with the anthropogenic global warming (AGW) hypothesis. And that figure is subject to other doubts, in addition to the selection bias noted above: the categorization into agree/disagree groups was made by “researchers” who were, in fact, environmental activists, who counted several papers written by so-called “skeptics” among the set that agreed with the strong AGW hypothesis. So the “97% of scientists” claim is a distortion of the actual findings, and the findings themselves are subject to severe methodological shortcomings. On the other hand, there are a number of widely recognized natural causes of climate change, as documented in this note on 240 papers published over just the first six months of 2016.

Data Integrity

It’s rare to meet a climate alarmist with any knowledge of how temperature data is actually collected. What exactly is the “global temperature”, and how can it be measured? It is a difficult undertaking, and it wasn’t until 1979 that it could be done with any reliability. According to Roy Spencer, that’s when satellite equipment began measuring:

“… the natural microwave thermal emissions from oxygen in the atmosphere. The intensity of the signals these microwave radiometers measure at different microwave frequencies is directly proportional to the temperature of different, deep layers of the atmosphere.”

Prior to the deployment of weather satellites, and starting around 1850, temperature records came only from surface temperature readings. These are taken at weather stations on land and collected at sea, and they are subject to quality issues that are generally unappreciated. Weather stations are unevenly distributed and they come and go over time; many of them produce readings that are increasingly biased upward by urbanization. Sea surface temperatures are collected in different ways with varying implications for temperature trends. Aggregating these records over time and geography is a hazardous undertaking, and these records are, unfortunately, the most vulnerable to manipulation.

The urbanization bias in surface temperatures is significant. According to this paper by Ross McKitrick, the number of weather stations counted in the three major global temperature series declined by more than 4,500 since the 1970s (over 75%), and most of those losses were rural stations. From McKitrick’s abstract:

“The collapse of the sample size has increased the relative fraction of data coming from airports to about 50% (up from about 30% in the late 1970s). It has also reduced the average latitude of source data and removed relatively more high altitude monitoring sites. Oceanic data are based on sea surface temperature (SST) instead of marine air temperature (MAT)…. Ship-based readings changed over the 20th century from bucket-and-thermometer to engine-intake methods, leading to a warm bias as the new readings displaced the old.”

Think about that the next time you hear about temperature records, especially NOAA reports on a “new warmest month on record”.

Data Manipulation

It’s rare to find alarmists with any awareness of the scandal at East Anglia University, which involved data falsification by prominent members of the climate change “establishment”. That scandal also shed light on corruption of the peer-review process in climate research, including a bias against publishing work skeptical of the accepted AGW narrative. Few are aware of a very recent scandal involving manipulation of temperature data at NOAA, in which retroactive adjustments were applied in an effort to make the past look cooler and more recent temperatures warmer. There is currently an outstanding FOIA request for communications between the Obama White House and a key scientist involved in the scandal. Here are Judith Curry’s thoughts on the NOAA temperature manipulation.

Think about all that the next time you hear about temperature records, especially NOAA reports on a “new warmest month on record”.

Other Warming Whoppers

Last week on social media, I noticed a woman emoting about the way hurricanes used to frighten her late mother. She was sharing an article about the presumed negative psychological effects that climate change is having on the general public. The bogus premises: that we are experiencing an increase in the frequency and severity of storms, that climate change is causing the storms, and that people are scared to death about it! Just to be clear, I don’t think I’ve heard much in the way of real panic, and real estate prices and investment flows don’t seem to be under any real pressure. In fact, the frequency and severity of severe weather have been in decline even as atmospheric carbon concentrations have increased over the past 50 years.

I heard another laughable claim at the party: that maps are showing great areas of the globe becoming increasingly dry, mostly at low latitudes. I believe the phrase “frying” was used. That is patently false, but I believe it’s another case in which climate alarmists have confused model forecasts with fact.

The prospect of rising sea levels is another matter that concerns alarmists, who always fail to note that sea levels have been increasing for a very long time, well before carbon concentrations could have had any impact. In fact, the sea level increases in the past few centuries are a rebound from lows during the Little Ice Age, and levels are now back to where the seas were during the Medieval Warm Period. But even those fluctuations look minor by comparison to the increases in sea levels that occurred over 8,000 years ago. Sea levels are rising at a very slow rate today, so slowly that coastal construction is proceeding as if there is little if any threat to new investments. While some of this activity may be subsidized by governments through cheap flood insurance, real money is on the line, and that probably represents a better forecast of future coastal flooding than any academic study can provide.

Old Ideas Die Hard

Two enduring features of the climate debate are 1) the extent to which so-called “carbon forcing” models of climate change have erred in over-predicting global temperatures, and 2) the extent to which those errors have gone unnoticed by the media and the public. The models have been plagued by a number of issues, for the climate is not a simple system. One basic shortcoming, however, has to do with the treatment of strong feedback effects: the alarmist community has asserted that feedbacks are positive, on balance, magnifying the warming impact of a given carbon forcing. In fact, the opposite seems to be true: second-order responses due to cloud cover, water vapor, and circulation effects are negative, on balance, at least partially offsetting the initial forcing.
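The feedback dispute can be stated with textbook algebra: if a forcing alone would produce warming ΔT₀, a net feedback fraction f scales the equilibrium response to ΔT₀/(1 − f). The formula is the standard linear-feedback relation; the numbers below are placeholders, not output from any climate model:

```python
# Equilibrium warming with a net feedback fraction f (f < 1):
# positive f amplifies the no-feedback response, negative f damps it.
def equilibrium_warming(dT0, f):
    assert f < 1.0, "f >= 1 would imply a runaway response"
    return dT0 / (1.0 - f)

base = 1.1  # illustrative no-feedback warming for a given forcing, deg C
print(equilibrium_warming(base, +0.5))  # net positive feedback: 2.2
print(equilibrium_warming(base, -0.5))  # net negative feedback: ~0.73
```

The alarmist case hinges on f being strongly positive; the author’s point is that the cloud and water-vapor evidence suggests f is negative, pulling the response below the no-feedback baseline.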

Fifty Years Ain’t History

One other amazing thing about the alarmist position is the insistence that the past 50 years be taken as a permanent trend. On a global scale, our surface temperature records are sketchy enough today, but recorded history is limited to the very recent past. There are recognized methods for estimating temperatures in the more distant past by using various temperature proxies. These are based on measurements of natural phenomena that are temperature-sensitive, such as ice cores, tree rings, and matter within successive sediment layers, like pollen and other organic compounds.

The proxy data has been used to create temperature estimates into the distant past. A basic finding is that the world has been this warm before, and even warmer, as recently as 1,000 years ago. This demonstrates the wide range of natural variation in the climate, and today’s global temperatures are well within that range. At the party I mentioned earlier, I was amused to hear a friend say, “Ya’ know, Greenland isn’t supposed to be green”, and he meant it! He is apparently unaware that Greenland was given that name by Viking settlers around 1000 AD, who inhabited the island during a warm spell lasting several hundred years… until it got too cold!

Carbon Is Not Poison

The alarmists take the position that carbon emissions are unequivocally bad for people and the planet. They treat carbon as if it were the equivalent of poisonous air pollution. The popular press often illustrates carbon emissions as black smoke pouring from industrial smokestacks, but like oxygen, carbon dioxide is a colorless gas, and one upon which life itself depends.

Our planet’s vegetation thrives on carbon dioxide, and increasing carbon concentrations are promoting a “greening” of the earth. Crop yields are increasing as a result; reforestation is proceeding as well. The enhanced vegetation provides an element of climate feedback against carbon “forcings” by serving as a carbon sink, absorbing increasing amounts of carbon and converting it to oxygen.

Matt Ridley has noted one of the worst consequences of the alarmists’ carbon panic and its influence on public policy: the vast misallocation of resources toward carbon reduction, much of it dedicated to subsidies for technologies that cannot pass economic muster. Those resources could be devoted to many other worthwhile purposes, like bringing electric power to third-world families who otherwise must burn dung inside their huts for heat; for that matter, they could be left under the control of taxpayers, who can put them to the uses they value most highly. The regulatory burdens imposed by these policies on carbon-intensive industries represent lost output that can never be recouped, all in the service of goals of questionable value. And of course, the anti-carbon effort almost certainly diverts resources to the detriment of more immediate environmental concerns, such as mitigating truly toxic industrial pollutants.

The priorities underlying the alarm over climate change are severely misguided. The public should demand better evidence than consistently erroneous model predictions and manipulated climate data. Unfortunately, a media eager for drama and statism is complicit in the misleading narrative.

FYI: The cartoon at the top of this post refers to the climate blog climateaudit.org. The site’s blogger Steve McIntyre did much to debunk the “hockey stick” depiction of global temperature history, though it seems to live on in the minds of climate alarmists. McIntyre appears to be on an extended hiatus from the blog.

Courts and Their Administrative Masters

04 Tuesday Apr 2017

Posted by Nuetzel in Big Government, Regulation

≈ 1 Comment

Tags

Administrative Law, Administrative State, Chevron Deference, Chevron USA, Clyde Wayne Crews, Competitive Enterprise Institute, Ilya Somin, Jonathan Adler, Kent Jordan, Natural Resources Defense Council, Neil Gorsuch, Philip Hamburger, Regulatory Dark Matter, Separation of Powers

Supreme Court nominee Neil Gorsuch says the judicial branch should not be obliged to defer to executive-branch agencies in interpreting law. Gorsuch’s opinion, however, runs contrary to a principle that has guided the courts since the 1984 Supreme Court ruling in Chevron U.S.A. v. Natural Resources Defense Council. Under what is known as Chevron deference, courts ask only whether the administrative agency’s interpretation of the law is “reasonable”, even if other “reasonable” interpretations are possible. This gets particularly thorny when the original legislation is ambiguous on a certain point. Gorsuch believes the Chevron standard subverts the intent of the constitutional separation of powers and judicial authority, a point of great importance in an age of explosive growth in administrative rule-making at the federal level.

Ilya Somin offers a defense of Gorsuch’s position on Chevron deference, arguing that it violates the constitutional text vesting the power to decide legal disputes in the judiciary, not the executive branch. The agencies, for their part, seem to be adopting increasingly expansive views of their authority:

“Some scholars argue that in many situations, agencies are not so much interpreting law, but actually making it by issuing regulations that often have only a tenuous basis in congressional enactments. When that happens, Chevron deference allows the executive to usurp the power of Congress as well as that of the judiciary.”

Jonathan Adler quotes a recent decision by U.S. Appeals Court Judge Kent Jordan in which he expresses skepticism regarding the wisdom of Chevron deference:

Deference to agencies strengthens the executive branch not only in a particular dispute under judicial review; it tends to the permanent expansion of the administrative state. Even if some in Congress want to rein an agency in, doing so is very difficult because of judicial deference to agency action. Moreover, the Constitutional requirements of bicameralism and presentment (along with the President’s veto power), which were intended as a brake on the federal government, being ‘designed to protect the liberties of the people,’ are instead, because of Chevron, ‘veto gates’ that make any legislative effort to curtail agency overreach a daunting task.

In short, Chevron ‘permit[s] executive bureaucracies to swallow huge amounts of core judicial and legislative power and concentrate federal power in a way that seems more than a little difficult to square with the Constitution of the [F]ramers’ design.’

The unchecked expansion of administrative control is a real threat to the stability of our system of government, our liberty, and the health of our economic system. It imposes tremendous compliance costs on society and often violates individual property rights. Regulatory actions are often taken without performing a proper cost-benefit analysis, and the decisions of regulators may be challenged initially only within a separate judicial system in which courts are run by the agencies themselves! I covered this point in more detail one year ago in “Hamburger Nation: An Administrative Nightmare“, based on Philip Hamburger’s book “Is Administrative Law Unlawful?“.

Clyde Wayne Crews of the Competitive Enterprise Institute gives further perspective on the regulatory state gone wild in “Mapping Washington’s Lawlessness: An Inventory of Regulatory Dark Matter“. He mentions some disturbing tendencies that go beyond the implementation of legislative intent: agencies sometimes choose to wholly ignore certain aspects of legislation; they tend to pressure regulated entities on the basis of interpretations that stretch the meaning of whatever enabling legislation exists; and, as if the exercise of extra-legislative power were not enough, administrative actions frequently subvert the price mechanism in private markets, disrupting the flow of accurate information about resource scarcity and the operation of the incentives that give markets their great advantages. All of these behaviors fit Crews’ description of “regulatory dark matter.”

Chevron deference represents an unforced surrender by the judicial branch to the exercise of power by the executive. As Judge Jordan notes in additional quotes provided by Adler at the link above, this does not deny the usefulness or importance of an agency’s specialized expertise. Nevertheless, the courts should not abdicate their role in reviewing the evidence an agency develops to support an action, and the reasonableness of the agency’s application of that evidence relative to alternative courses of action. Nor should the courts abdicate their role in ruling on the law itself. Judge Gorsuch is right: Chevron deference should be re-evaluated by the courts.
