D’Amato is on the Board of Policy Advisors for the Heartland Institute and he is the Benjamin Tucker Research Fellow at the Molinari Institute’s Center for a Stateless Society. He earned a JD from New England School of Law and an LLM in Global Law and Technology from Suffolk University Law School.
The disciplines we call the “hard sciences,” such as chemistry and physics, inhabit the cold, sterile world of laboratories, uncontaminated by a boundless assortment of potential impurities.
Today, economics pretends to be one of the hard sciences, yet the laboratories provided by the real world are disorderly, even chaotic, insusceptible to sanitization and control. The information we are able to glean from them is thus limited in its power to offer laws of general applicability, to explain the world. It is not that there is a lack of potentially useful information; after all, economists and social scientists are gathering and analyzing statistical data constantly. Rather, those data are limited by the density of the causal atmosphere of the environment from which they emerge, a rich and variable sea of causes and effects. Isolating one or even a few factors becomes impossible.
As Jim Manzi explains in his book Uncontrolled, “[W]e can never be sure that any experiment actually has controlled for every possible alternative cause of an outcome.” And while this is, of course, true in every field of inquiry, the problem is especially acute within the social sciences, so-called. That’s because, as Manzi observes, “human social organizations have a causal density that dwarfs anything astrophysics considers.”
This proposition — that social and economic phenomena are far more complex than subjects such as astrophysics — may be counterintuitive, but human beings are unlike celestial bodies in important respects. We act unpredictably and often arbitrarily, driven by emotion and by desires that are subjective and incommensurable. Our institutions reflect the caprices of our nature, held together far more by largely arbitrary custom than by anything we might call “science.” Thus, our random, human medley of actions and institutions is difficult to measure and test, its movements defying our predictions. In contrast, the movements of objects in space can be predicted with surprising accuracy, quantum weirdness notwithstanding.
For any given observable phenomenon, the scientist must attempt to parse a convoluted web of actual and potential causes. Unable to control the experiment, its environmental inputs, groups, etc., the social scientist is unable to know whether the hypothesis being tested has been confirmed. This causal density means economic data must always be the subject of several competing explanations, informed by ideology and extra-economic social theory. Political economists such as Adam Smith understood this and did not shy from considering questions of epistemology, ethics, and politics in their economic analyses.
Today’s economists should follow Smith’s lead. Interpretations of data, the stories social scientists tell about what statistical information means, are contingent, dependent on the lenses through which the viewer looks. Thus, the characteristics and clarity of the lenses matter a great deal, supplementing and contextualizing our observations. The great classical liberal political economist Jean-Baptiste Say foresaw the complacency of today’s economists, their tendency to oversell the power of data and mathematics. Anticipating the praxeology of Ludwig von Mises, Say held that the proper foundations for economics are “the rigorous deductions of undeniable general facts,” not “new particular fact[s]” (i.e., statistics), but basic laws of human action. A “new particular fact” has its place, of course, but the conscientious social scientist must establish “the connexion between its antecedents and its consequents… by reasoning,” mindful of the intricacy of the chain of causation.
Say was sensitive to the limitless complexity of that which is so facilely called the “economy.” Without the ability to control the experiment, to isolate variables and conditions, how can we be sure “that some unknown circumstance has not produced the difference noticed in their several results?” Say’s political economy was genuinely scientific, his method skeptical and always scrutinizing. He understood that all great falsehoods are supported by facts — facts drifting free of their contextual moorings but facts nonetheless. Rather than discounting the importance of evidence itself, we must undertake a re-evaluation of what counts as evidence, making space for “primary source material and interview and survey work.”
If empirical data are often too messy, too causally intricate to interpret without the help of a philosophical or interpretative framework, then mathematical models are in a sense too neat to tell us very much about reality; they reduce enormously complex concepts and arguments about economic behavior to sterile formulae. Sometimes this is useful, as in the case of an economic model that explains the relationship between supply and demand. But as economists address their model-building processes to more difficult questions, the serviceability of the models diminishes. And if we are to believe the critics of “mathiness,” whom we can find all over the spectrum of ideas, the preoccupation with practically useless mathematical models has all but completely overtaken the economics profession.
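The supply-and-demand case conceded above can be sketched in a few lines of code; the linear functional forms and every number below are hypothetical, chosen only to show how tidy such a formalism is compared with the behavior it summarizes:

```python
# A minimal, illustrative sketch: a linear supply-and-demand model.
# Demand: Qd = a - b*p (buyers want less as price rises)
# Supply: Qs = c + d*p (sellers offer more as price rises)
# Equilibrium is the price at which Qd == Qs.

def equilibrium(a, b, c, d):
    """Return (price, quantity) where a - b*p == c + d*p."""
    price = (a - c) / (b + d)
    quantity = a - b * price
    return price, quantity

# Hypothetical parameters: demand Q = 100 - 2p, supply Q = 10 + 4p.
price, quantity = equilibrium(100, 2, 10, 4)
print(price, quantity)  # 15.0 70.0
```

The equilibrium falls out by simple algebra precisely because everything messy, i.e., subjective preferences, shifting expectations, institutional context, has been assumed away, which is the article’s point about where such models stop being serviceable.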
Mathematical models, agglomerations of equations using multivariable calculus, are, it turns out, not a language suited to the task of describing something as dynamic as human behavior. Among the axioms of modern economics is the idea that economic value is something assigned to goods and services subjectively by individual buyers and sellers. As Austrian School economists frequently point out, there is an irreducible subjectivity at the heart of all economic action. This explanation of value in terms of subjective preference and marginal utility replaced classical theories that made value a function of the quantities of labor expended during a good’s production. If value subjectivism holds, then, for example, one’s partiality for Chicago-style pizza as opposed to New York-style pizza is simply not the kind of preference that can be quantified. There is, as the saying goes, no accounting for taste.
It’s a simple example, but it points to a much more general and far-reaching truth: Formal logic and mathematics are not a stable foundation for the economist. This has been borne out by the inability of computer models to anticipate the movements of actual markets. For all their complex mathematics and pretensions to rigorousness, these models rely on crude oversimplifications. As New York University economist Mario J. Rizzo notes, “Ceteris paribus prediction is prediction of ‘stylized facts,’” whose connection to the real world is tenuous at best.
It is no wonder that such predictions, so precariously balanced on layers of stipulations, should fail; were economics reducible to the mere balancing of mathematical equations, predictions would be perfect, crashes unthinkable. The economic crisis of 2008 confounded economists’ models, their simulated economies betraying their inability to tell us anything useful about the real world. No computer model can explain or express the infinite complexity of an economy, even the smallest, simplest one.
The practical significance of these observations ought to be apparent: Economics provides the basis for much of the way we think about solutions to perceived public policy problems. The quality of those solutions would be served by an economic approach that appreciates the limits and shortcomings of theoretical models and quantitative data and the continuing relevance of verbal economics and intuitive argumentation. Real scientific rigor of the desired kind is a product of academic honesty and a deliberate modesty about the limits of human knowledge and designs.
[Originally Published at American Spectator]