Quantum physics has always had a math problem, one that troubled even the leading physicists of its early days. Einstein famously remarked, “God does not play dice with the universe.” Erwin Schrödinger’s thought experiment, now known simply as “Schrödinger’s cat,” was meant to mock the Copenhagen interpretation rather than help explain it.

But we live in an age of math largely because most of the math seems to work, though almost no one knows how or by how much. If you read on the internet some statistical study that purports to show X, X is commonly accepted because the study bears all the familiar criteria of what is essentially nothing more than a brand: SCIENCE. Scrutiny underneath the label is almost impossible because what exists there is so complex as to be utterly incomprehensible even to most intelligent people. The truth is that is the way its practitioners like it; not because statistics is an infallible tool for discerning the truth, or even what works, but because it makes them the high priests of whatever discipline, afforded a pedestal from which to go forever unchallenged (except by other high priests).

The mathematical problems of quantum theory are solved by a process called renormalization. I won’t get that far into it, as it is as complex as you would guess it to be, but as a matter of mathematics it is a process for sorting out the infinities and divergences in equations. That introduces questions of interpretation: are physicists deducing something “real” by a process which prioritizes whichever mathematics isn’t rendered gibberish by infinities? Is there a balance to strike between making the math work and the information that might be sacrificed in making it work?
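To give a flavor of what that sorting entails – a toy schematic only, not any particular quantum field theory calculation – consider a logarithmically divergent integral:

```latex
% A "bare" quantity diverges:
I \;=\; \int_{0}^{\infty} \frac{dk}{k} \;=\; \infty

% Regularize it with an infrared cutoff \mu and an ultraviolet cutoff \Lambda:
I(\mu,\Lambda) \;=\; \int_{\mu}^{\Lambda} \frac{dk}{k} \;=\; \ln\frac{\Lambda}{\mu}

% Renormalization then absorbs the cutoff-dependence into redefined
% ("renormalized") parameters, so measurable predictions remain finite
% as \Lambda \to \infty.
```

The infinity is not so much solved as reorganized into quantities that are never directly observed; whether anything real is lost in that reorganization is exactly the interpretive question.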

In quantum physics, it’s far less controversial because the math has proven legitimate over and over and over again (though recent discoveries, and really the lack of them at the LHC, have restarted some of these questions). It isn’t called renormalization in other contexts, but it is largely the same sorting process that takes place in other fields such as Economics (capital “E”).

One of the most widely accepted and basic of these shortcuts was rational expectations, where in the early 1970s Robert Lucas built upon the work of John Muth to produce a rational function that solved an infinity holding up the development of general equilibrium models. Economists had been dying for a GE model for years, so when Lucas formulated his shortcut it was widely celebrated without much thought about what it meant in the real world. He had been working on those interpretations of the function before his breakthrough, writing in a paper submitted in September 1970 (and published in 1972):

The relationship, essentially a variant of the well-known Phillips curve, is derived within a framework from which all forms of “money illusion” are rigorously excluded: all prices are market clearing, all agents behave optimally in light of their objectives and expectations, and expectations are formed optimally.

You can see what he was doing and what eventually led to the great advance that won him the Nobel Prize in 1995. If individual agents acted in different ways, including some that may not be “rational” (those prone to the so-called money illusion), there would be no way to reduce expectations into a probability distribution with finite, defined properties. Thus, in order to change an unmeasurable distribution into one that could be described by a rational mathematical function, Lucas gives us what are mere assumptions, written above as “all prices are market clearing” and “all agents behave optimally.”
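To see in miniature why heterogeneity ruins the math, here is a sketch (a deliberately cartoonish simulation; the 2% model mean, the 5% “illusion” forecast, and the 30% illusion share are all invented numbers, not anything from Lucas):

```python
import random

random.seed(0)

true_mean = 2.0  # the model-implied expected inflation rate, say 2%

# Rational expectations: every agent's forecast IS the model mean,
# so aggregate expectations collapse to one well-defined number.
rational_forecasts = [true_mean for _ in range(1000)]

# Heterogeneous agents: some suffer "money illusion" and anchor on
# an arbitrary nominal figure (5% here) instead of the model mean.
mixed_forecasts = [
    true_mean if random.random() < 0.7 else 5.0
    for _ in range(1000)
]

avg_rational = sum(rational_forecasts) / len(rational_forecasts)
avg_mixed = sum(mixed_forecasts) / len(mixed_forecasts)

print(avg_rational)  # exactly the model mean: 2.0
print(avg_mixed)     # drifts above it, by an amount depending on
                     # unmeasurable details of who believes what
```

Under the rational expectations assumption the aggregate forecast is pinned to a single model distribution; admit even one class of non-rational agents and the aggregate depends on things no equation can pin down.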

It is a ridiculous proposition in the real world, largely because it doesn’t happen that way. One need only review the asset bubbles that occur in all kinds of circumstances to understand the fallacy, especially as it relates to money. Economists, under econometrics, sacrificed a real-world framing of an important economic process for the sake of mathematical expediency. It actually explains a whole lot about the past few decades, especially the Great “Moderation” and its aftermath. For Lucas, and all who have followed him, there were no asset bubbles because their shortcut can’t ever model them. They have thus been stricken from the canon no matter how many pop up in the real world.

It makes for a very uncomfortable intellectual basis, especially considering how incongruous it is that a Great “Moderation” could be followed by a Great “Recession” without a recovery (a depression). The title of Lucas’ paper was Expectations and the Neutrality of Money. The principle of monetary neutrality has become a cornerstone of Economics, and it is all based on renormalization in math, not observation. In fact, Economists have often gone to great lengths to exclude common sense just so that they can arrive at mathematical conclusions preserving neutrality.

This was always a problem because all roads in these academic efforts led back to the Great Depression. In June 1983, a young economist made something of a name for himself writing a widely discussed and accepted paper on monetary neutrality and the Great Depression. Ben Bernanke wrote in Nonmonetary Effects of the Financial Crisis in the Propagation of the Great Depression:

One problem is that there is no theory of monetary effects [per se] on the real economy that can explain protracted nonneutrality. Another is that the reductions of the money supply in this period seem quantitatively insufficient to explain the subsequent falls in output.

And this is generally why in 2017 the Federal Reserve has finally given up on recovery, searching instead for the smallest evidence that Baby Boomers have retired in huge numbers (all clustered, somehow, in late 2008 and early 2009) and that the rest of the American labor force is riddled with drug addiction and laziness – there couldn’t possibly be a monetary explanation because math…and Bernanke.

I wrote yesterday about what historically low initial jobless claims could mean in the context of a depression, one that looks more like Japan from the 1990’s forward than the US of the 1930’s. Every modern economy is at its root dependent upon labor. Full stop. A modern economy is nothing but labor specialization, therefore it cannot proceed where labor is so imbalanced without some other effect creating an offsetting position. Monetary neutrality, but especially rational expectations, means Economists cannot consider the housing bubble (really eurodollar bubbles) as a candidate for why the US offshored so much labor capacity yet its economy kept going, if only until 2008.

But where duty to Economics requires ignorance, we are under no such restraint. In fact, there is a great deal of literature supporting at least the first part – that the decline in labor’s share of economic output is related to what Ross Perot once declared would be a “giant sucking sound.” In 2013, for example, the Brookings Institution, the same outfit that now also sees fit to include Ben Bernanke as a scholar and distribute his post-Fed opinions, published a paper seeking to explain, by regressions, of course, the reasons for the decline in labor’s share. Authors Michael Elsby, Bart Hobijn (of FRBSF), and Aysegul Sahin (of FRBNY) write:

Finally, our analysis identifies offshoring of the labor-intensive component of the U.S. supply chain as a leading potential explanation of the decline in the U.S. labor share over the past 25 years.

Their calculations for that decline largely match what I presented yesterday, especially the likely inflection dating to around 1984.

The paper also identifies a global issue with labor compensation, where even in countries like China that have been on the receiving end of offshoring (from the US perspective) the imbalance is still present.

It is worth noting that this account is plausibly consistent with declines in labor shares not only in source countries, such as the U.S., but also in the destination countries to which production is offshored, such as China. This is important if, as Karabarbounis and Neiman (2013) suggest, declines in labor’s share have occurred globally. In particular, it is possible that offshored production processes that are labor-intensive by U.S. standards also are capital-intensive relative to existing production in China. [emphasis added]

This is an enormously important point, one that I don’t think the authors fully appreciated outside the narrow context of their paper. And it is in this point where the monetary explanation for the last few decades escapes the confines of assumed neutrality. In other words, as I have written before, NAFTA was only one part of the “giant sucking sound”; the other, prerequisite part was the eurodollar system. It was never enough that these countries held out a deep pool of cheap labor; there had to be a mechanism to mobilize it from the view of finance, from new facilities to the logistics of resources (and even graft), and then to keep it all liquid and flowing.

That was the eurodollar, and I believe that is why the “giant sucking sound” didn’t happen immediately following NAFTA. It wasn’t until after 1995, and really the collapse of the dot-com bubble, that the eurodollar system truly got going; and only then did US manufacturing jobs start to go.

The 2013 Brookings paper even cites prior evidence as to these nascent conditions:

Feenstra and Hanson (1996b), for example, note the importance of increased capital flows from the United States to Mexico in reconciling wage movements associated with offshoring between these two economies. [emphasis added]

Working backward, then, without the eurodollar’s monetary and credit capabilities there may not have been so much offshoring; therefore the US economic position may not have been so weakened, and the massive imbalances of the early 21st century might have corrected before there was ever a Great “Recession” leading to depression. I would call that protracted nonneutrality.

In their conclusions, the authors figure that “the import exposure of U.S. businesses can account for 3.3 percentage points of the 3.9 percentage point decline in the U.S. payroll share over the past quarter century.” Given that this evidence, too, is a regression, take it for what it may be worth. Without that payroll share, however, the global economy was highly susceptible to a monetary reversal, which did come in waves, starting in 2008 but reaching the EM side more clearly in 2013 and the “rising dollar” period after. That has left the world economy stuck in a depression that Economists will never understand so long as they irrationally prioritize renormalized mathematics over economics (small “e”).
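As a quick check of the arithmetic in that quote, using only the paper’s own figures (no new estimates):

```python
# Back-of-envelope using the Brookings figures quoted above.
total_decline = 3.9      # percentage-point decline in U.S. payroll share
import_exposure = 3.3    # pp of that decline attributed to import exposure

share_explained = import_exposure / total_decline
print(round(share_explained * 100, 1))  # 84.6 – the bulk of the decline
```

By the paper’s own accounting, then, import exposure explains roughly five-sixths of the entire fall in the payroll share.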

Economists are mathematicians first and hold very little expertise about the economy outside of statistics. In other words, we don’t actually need to be afraid of the complex equations because it isn’t necessary to limit our challenges to their own mathematical terms (that’s what the media and politicians do). But even when you do enter their alternate statistical reality, it’s not really all that difficult, either. If it seems like Economists can’t get anything right, it’s because they can’t. Economics is a mathematical illusion standing in for competence, nothing more. Until politicians stop cowering in fear of it, however, we are all stuck until something really breaks.

Ignorance as established doctrine is no longer tolerable, and hasn’t really been for a full decade now. So long as it is tolerated, this depression will continue.