Tag Archives: economics

The crisis was not predicted because crises aren’t predictable?

There’s a terrific essay on economics by John Kay on the INET blog. Some juicy excerpts follow, but it’s really worth the trip to read the whole thing. They’ve invited some other economists to respond, which should be interesting.

The Map is Not the Territory: An Essay on the State of Economics

by JOHN KAY

The reputation of economics and economists, never high, has been a victim of the crash of 2008. The Queen was hardly alone in asking why no one had predicted it. An even more serious criticism is that the economic policy debate that followed seems only to replay the similar debate after 1929. The issue is budgetary austerity versus fiscal stimulus, and the positions of the protagonists are entirely predictable from their previous political allegiances.

The doyen of modern macroeconomics, Robert Lucas, responded to the Queen’s question in a guest article in The Economist in August 2009.[1] The crisis was not predicted, he explained, because economic theory predicts that such events cannot be predicted. Faced with such a response, a wise sovereign will seek counsel elsewhere.

[...]All science uses unrealistic simplifying assumptions. Physicists describe motion on frictionless planes, gravity in a world without air resistance. Not because anyone believes that the world is frictionless and airless, but because it is too difficult to study everything at once. A simplifying model eliminates confounding factors and focuses on a particular issue of interest. To put such models to practical use, you must be willing to bring back the excluded factors. You will probably find that this modification will be important for some problems, and not others – air resistance makes a big difference to a falling feather but not to a falling cannonball.

But Lucas and those who follow him were plainly engaged in a very different exercise, as the philosopher Nancy Cartwright has explained.[4] The distinguishing characteristic of their approach is that the list of unrealistic simplifying assumptions is extremely long. Lucas was explicit about his objective[5] – ‘the construction of a mechanical artificial world populated by interacting robots that economics typically studies’. An economic theory, he explains, is something that ‘can be put on a computer and run’. Lucas has called structures like these ‘analogue economies’, because they are, in a sense, complete economic systems. They loosely resemble the world, but a world so pared down that everything about them is either known, or can be made up. Such models are akin to Tolkien’s Middle Earth, or a computer game like Grand Theft Auto.

[... interesting discussion of the fiscal crisis as a debate over Ricardian equivalence ...]
But another approach would discard altogether the idea that the economic world can be described by a universally applicable model in which all key relationships are predetermined. Economic behaviour is influenced by technologies and cultures, which evolve in ways that are certainly not random but which cannot be described fully, or perhaps at all, by the kinds of variables and equations with which economists are familiar. Models, when employed, must therefore be context specific, in the manner suggested in a recent book by Roman Frydman and Michael Goldberg.[8]

[...]

But you would not nowadays be able to publish similar articles in a good economics journal. You would be told that your model was theoretically inadequate – it lacked rigour, failed to demonstrate consistency. You might be accused of the cardinal sin of being ‘ad hoc’. Rigour and consistency are the two most powerful words in economics today.

[...]

Consistency and rigour are features of a deductive approach, which draws conclusions from a group of axioms – and whose empirical relevance depends entirely on the universal validity of the axioms. The only descriptions that fully meet the requirements of consistency and rigour are complete artificial worlds, like those of Grand Theft Auto, which can ‘be put on a computer and run’.

For many people, deductive reasoning is the mark of science, while induction – in which the argument is derived from the subject matter – is the characteristic method of history or literary criticism. But this is an artificial, exaggerated distinction. ‘The first siren of beauty’, says Cochrane, ‘is logical consistency’. It seems impossible that anyone acquainted with great human achievements – whether in the arts, the humanities or the sciences – could really believe that the first siren of beauty is consistency. This is not how Shakespeare, Mozart or Picasso – or Newton or Darwin – approached their task.

[...] Economists who assert that the only valid prescriptions in economic policy are logical deductions from complete axiomatic systems take prescriptions from doctors who often know little more about these medicines than that they appear to treat the disease. Such physicians are unashamedly ad hoc; perhaps pragmatic is a better word. With exquisite irony, Lucas holds a chair named for John Dewey, the theorist of American pragmatism.

[...] The modern economist is the clinician with no patients, the engineer with no projects. And since these economists do not appear to engage with the issues that confront real businesses and actual households, the clients do not come. There are, nevertheless, many well paid jobs for economists outside academia. Not, any more, in industrial and commercial companies, which have mostly decided economists are of no use to them. Business economists work in financial institutions, which principally use them to entertain their clients at lunch or advertise their banks in fillers on CNBC. Economic consulting employs economists who write lobbying documents addressed to other economists in government or regulatory agencies.

[...]A review of economics education two decades ago concluded that students should be taught ‘to think like economists’. But ‘thinking like an economist’ has come to be interpreted as the application of deductive reasoning based on a particular set of axioms. Another Chicago Nobel Prize winner, Gary Becker, offered the following definition: ‘the combined assumptions of maximising behaviour, market equilibrium, and stable preferences, used relentlessly and consistently form the heart of the economic approach’.[13] Becker’s Nobel citation rewards him for ‘having extended the domain of microeconomic analysis to a wide range of economic behavior.’ But such extension is not an end in itself: its value can lie only in new insights into that behaviour.

‘The economic approach’ as described by Becker is not, in itself, absurd. What is absurd is the claim to exclusivity he makes for it: a priori deduction from a particular set of unrealistic simplifying assumptions is not just a tool but ‘the heart of the economic approach’. A demand for universality is added to the requirements of consistency and rigour. Believing that economics is like they suppose physics to be – not necessarily correctly – economists like Becker regard a valid scientific theory as a representation of the truth – a description of the world that is independent of time, place, context, or the observer. [...]

The further demand for universality with the consistency assumption leads to the hypothesis of rational expectations and a range of arguments grouped under the rubric of ‘the Lucas critique’. If there were to be such a universal model of the economic world, economic agents would have to behave as if they had knowledge of it, or at least as much knowledge of it as was available, otherwise their optimising behaviour would be inconsistent with the predictions of the model. This is a reductio ad absurdum argument, which demonstrates the impossibility of any universal model – since the implications of the conclusion for everyday behaviour are preposterous, the assumption of model universality is false.

[...]Economic models are no more, or less, than potentially illuminating abstractions. Another philosopher, Alfred Korzybski, puts the issue more briefly: ‘the map is not the territory’.[15] Economics is not a technique in search of problems but a set of problems in need of solution. Such problems are varied and the solutions will inevitably be eclectic.

This is true for analysis of the financial market crisis of 2008. Lucas’s assertion that ‘no one could have predicted it’ contains an important, though partial, insight. There can be no objective basis for a prediction of the kind ‘Lehman Bros will go into liquidation on September 15’, because if there were, people would act on that expectation and, most likely, Lehman would go into liquidation straight away. The economic world, far more than the physical world, is influenced by our beliefs about it.

Such thinking leads, as Lucas explains, directly to the efficient market hypothesis – available knowledge is already incorporated in the price of securities. [...]

In his Economist response, Lucas acknowledges that ‘exceptions and anomalies’ to the efficient market hypothesis have been discovered, ‘but for the purposes of macroeconomic analyses and forecasts they are too small to matter’. But how could anyone know, in advance not just of this crisis but also of any future crisis, that exceptions and anomalies to the efficient market hypothesis are ‘too small to matter’?

[...]The claim that most profit opportunities in business or in securities markets have been taken is justified. But it is the search for the profit opportunities that have not been taken that drives business forward, the belief that profit opportunities that have not been arbitraged away still exist that explains why there is so much trade in securities. Far from being ‘too small to matter’, these deviations from efficient market assumptions, not necessarily large, are the dynamic of the capitalist economy.

[...]

The preposterous claim that deviations from market efficiency were not only irrelevant to the recent crisis but could never be relevant is the product of an environment in which deduction has driven out induction and ideology has taken over from observation. The belief that models are not just useful tools but also are capable of yielding comprehensive and universal descriptions of the world has blinded its proponents to realities that have been staring them in the face. That blindness was an element in our present crisis, and conditions our still ineffectual responses. Economists – in government agencies as well as universities – were obsessively playing Grand Theft Auto while the world around them was falling apart.

The first response, from Paul Davidson, is already in.

Economists in the bathtub

Env-Econ is one of several econ sites to pick up on standupeconomist Yoram Bauman’s assessment, Grading Economics Textbooks on Climate Change.

Most point out the bad, but there’s also a lot of good. On Bauman’s curve, there are 4 As, 3 Bs, 5 Cs, 3 Ds, and one F. Still, the bad tends to be really bad. Bauman writes about one,

Overall, the book is not too bad if you ignore that it’s based on climate science that is almost 15 years out of date and that it has multiple errors that would make Wikipedia blush. The fact that this textbook has over 20 percent of the market shakes my faith in capitalism.

The interesting thing is that the worst textbooks go astray more on the science than on the economics. The worst cherry-pick outdated studies, distort the opinions of scientists, and toss in red herrings like “For Greenland, a warming climate is good economic news.”

I find the most egregious misrepresentation in Schiller’s The Economy Today (D+):

The earth’s climate is driven by solar radiation. The energy the sun absorbs must be balanced by outgoing radiation from the earth and the atmosphere. Scientists fear that a flow imbalance is developing. Of particular concern is a buildup of carbon dioxide (CO2) that might trap heat in the earth’s atmosphere, warming the planet. The natural release of CO2 dwarfs the emissions from human activities. But there’s a concern that the steady increase in man-made CO2 emissions—principally from burning fossil fuels like gasoline and coal—is tipping the balance….

First, there’s no “might” about the fact that CO2 traps heat (infrared radiation); the only question is how much, when feedback effects come into play.  But the bigger issue is Schiller’s implication about the cause of atmospheric CO2 buildup. Here’s a picture of Schiller’s words, with arrow width scaled roughly to actual fluxes:

[Figure: Schiller’s implied one-way CO2 flow into the atmosphere, arrow width roughly scaled to actual fluxes]

Apparently, nature is at fault for increasing atmospheric CO2. This is like worrying that the world will run out of air, because people are inhaling it all (Schiller may be inhaling something else). The reality is that the natural flux, while large, is a two way flow:

[Figure: the natural flux drawn as the two-way flow it actually is, between the atmosphere and the ocean & biosphere]

What goes into the ocean and biosphere generally comes out again. For the last hundred centuries, those flows were nearly equal (i.e. zero net flow). But now that humans are emitting a lot of carbon, the net flow is actually from the atmosphere into natural systems, like this:

[Figure: with human emissions added, the net natural flux runs from the atmosphere into the ocean & biosphere]

That’s quite a different situation. If an author can’t paint an accurate verbal picture of a simple stock-flow system like this, how can a text help students learn to manage resources, money or other stocks?
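To make the stock-flow point concrete, here’s a minimal sketch in Python (my own toy, with round illustrative fluxes in the spirit of the figures above, not measured data) showing why a huge gross natural flux tells you nothing about who is responsible for the buildup:

```python
# Minimal stock-flow sketch of atmospheric CO2 (round, illustrative numbers, GtC/yr).
# Gross natural fluxes dwarf human emissions, but they nearly cancel; the small
# human addition tips the balance, and natural systems become a net sink.

atmosphere = 750.0             # atmospheric carbon stock, GtC (rough order of magnitude)
gross_natural_release = 120.0  # ocean & biosphere -> atmosphere, GtC/yr
gross_natural_uptake = 124.0   # atmosphere -> ocean & biosphere, GtC/yr (now slightly larger)
human_emissions = 8.0          # fossil fuels, cement, land use, GtC/yr

for year in range(5):
    net_natural = gross_natural_release - gross_natural_uptake  # negative: nature is a net sink
    atmosphere += human_emissions + net_natural
    print(f"year {year}: atmosphere = {atmosphere:.0f} GtC, "
          f"net natural flux = {net_natural:+.1f} GtC/yr")

# Despite ~120 GtC/yr of gross natural flow in each direction, the stock rises
# about 4 GtC/yr -- and all of that increase traces back to the human addition.
```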

Hope is not a method

My dad pointed me to this interesting Paul Romer interview on BBC Global Business. The BBC describes Romer as an optimist in a dismal science. I think of Paul Romer as one of the economists who founded the endogenous growth movement, though he’s probably done lots of other interesting things. I personally find the models typically employed in the endogenous growth literature to be silly, because they retain assumptions like perfect foresight (we all know that hard optimal control math is the essence of a theory of behavior, right?).  In spite of their faults, those models are a huge leap over earlier work (exogenous technology) and subsequent probing around the edges has sparked a very productive train of thought.

About 20 minutes in, Romer succumbs to the urge to bash the Club of Rome (economists lose their union card if they don’t do this once in a while). His reasoning is part spurious and part interesting. The spurious part is his blanket condemnation, that simply everything about it was wrong. That’s hard to accept or refute, because it’s not clear if he means Limits to Growth, or the general tenor of the discussion at the time. If he means Limits, he’s making the usual straw man mistake. To be fair, the interviewer does prime him by (incorrectly) summarizing the Club of Rome argument as “running out of raw materials.” But Romer takes the bait, and adds, “… they were saying the problem is we wouldn’t have enough carbon resources, … the problem is we have way too much carbon resources and are going to burn too much of it and damage the environment….” If you read Limits, this was actually one of the central points – you may not know which limit is binding first, but if you dodge one limit, exponential growth will quickly carry you to the next.

Interestingly, Romer’s solution to sustainability challenges arises from a more micro, evolutionary perspective rather than the macro single-agent perspective in most of the growth literature. He argues against top-down control and for structures (like his charter cities) that promote greater diversity and experimentation, in order to facilitate the discovery of new ways of doing things. He also talks a lot about rules as a moderator for technology – for example, that it’s bad to invent a technology that permits greater harvest of a resource, unless you also invent rules that ensure the harvest remains sustainable. I think he and I and the authors of Limits would actually agree on many real-world policy prescriptions.

However, I think Romer’s bottom-up search for solutions to human problems through evolutionary innovation is important but will, in the end, fail in one important way. Evolutionary pressures within cities, countries, or firms will tend to solve short-term, local problems. However, it’s not clear how they’re going to solve problems larger in scale than those entities, or longer in tenure than the agents running them. Consider resource depletion: if you imagine for a moment that there is some optimal depletion path, a country can consume its resources faster or slower than that. Too fast, and they get rich quick but have a crisis later. Too slow, and they’re self-deprived now. But there are other options: a country can consume its resources quickly now, build weapons, and seize the resources of the slow countries. Also, rapid extraction in some regions drives down prices, creating an impression of abundance, and discouraging other regions from managing the resource more cautiously. The result may be a race for the bottom, rather than evolution of creative long term solutions. Consider also climate: even large emitters have little incentive to reduce, if only their own damages matter. To align self-interest with mitigation, there has to be some kind of external incentive, either imposed top-down or emerging from some mix of bottom-up cooperation and coercion.
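As a toy illustration of that race for the bottom (my own sketch, with made-up payoffs, not anything from Romer), the incentive structure is a familiar prisoner’s dilemma: fast extraction is the individually dominant choice even though mutual restraint pays better, so bottom-up selection pressure favors the collectively worse outcome unless some external rule changes the payoffs:

```python
# Toy two-country resource extraction game (illustrative payoffs, not data).
# Each country chooses "slow" (sustainable extraction) or "fast" (deplete quickly).

payoffs = {  # (my choice, their choice) -> (my payoff, their payoff)
    ("slow", "slow"): (3, 3),  # both conserve: best joint outcome
    ("slow", "fast"): (0, 4),  # the conserver loses market share or gets its resources seized
    ("fast", "slow"): (4, 0),
    ("fast", "fast"): (1, 1),  # race for the bottom
}

def best_response(their_choice):
    """My payoff-maximizing choice, given what the other country does."""
    return max(("slow", "fast"), key=lambda mine: payoffs[(mine, their_choice)][0])

for their_choice in ("slow", "fast"):
    print(f"If the other country plays {their_choice!r}, my best response is {best_response(their_choice)!r}")

# 'fast' wins in both cases, so (fast, fast) is the equilibrium, even though
# (slow, slow) would make both countries better off -- hence the need for rules.
```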

If you propose to solve long term global problems only through diversity, evolution, and innovation, you are in effect hoping that those measures will unleash a torrent of innovation that will solve the big problems coincidentally, or that we’ll stay perpetually ahead in the race between growth and side-effects. That could happen, but as the Dartmouth clinic used to say about contraception, “hope is not a method.”

What about the real economy?

I sort of follow a bunch of economics blogs. Naturally they’re all very much preoccupied with the financial crisis. There’s a lot of debate about Keynesian multipliers, whether the stimulus will work, liquidity traps, bursting bubbles, and the like. If you step back, it appears to be a conversation about how to use fiscal and monetary policy to halt a vicious cycle of declining expectations fueled by financial instruments no one really understands – essentially an attempt to keep the perceived economy from dragging down the real economy (as it is clearly now doing). The implicit assumption often seems to be that, if we could only untangle the current mess, the economy would return to its steady state growth path.

What I find interesting is that there’s little mention of what might have been wrong in the real economy to begin with, and its role in the current crisis. Clearly the last decade was a time of disequilibrium, not just in the price of risk, but in the real capital investments and consumption patterns that flowed from it. My working hypothesis is that we were living in a lala land of overconsumption, funded by deficits, sovereign wealth funds, resource drawdown, and failure to invest in our own future. In that case, the question for the real economy is, how much does consumption have to fall to bring things back into balance? My WAG is 15% – which implies a heck of a lot of reallocation of activity in the real economy. What does that look like? Could we see it through the fog of knock-on effects that we’re now experiencing? Is there something we could be doing, on top of fiscal and monetary policy, to ease the transition?
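A rough scaling, just to put that WAG in perspective (using round figures of my own choosing – GDP near $14 trillion and consumption around 70% of GDP – rather than anything precise):

```python
# Back-of-envelope scale of a 15% consumption cut (round, illustrative figures).
gdp = 14.0e12            # US GDP, roughly $14 trillion circa 2008
consumption_share = 0.7  # personal consumption as a share of GDP, approximately
cut = 0.15               # hypothesized overconsumption to unwind

reduction = cut * consumption_share * gdp
print(f"consumption cut: ${reduction / 1e12:.1f} trillion "
      f"(~{reduction / gdp:.0%} of GDP to be reallocated)")

# Roughly $1.5 trillion, on the order of 10% of GDP -- a very large reallocation
# of real activity, not something fiscal fine-tuning alone can paper over.
```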

Are We Slaves to Open Loop Theories?

The ongoing bailout/stimulus debate is decidedly Keynesian. Yet Keynes was a halfhearted Keynesian:

US Keynesianism, however, came to mean something different. It was applied to a fiscal revolution, licensing deficit finance to pull the economy out of depression. From the US budget of 1938, this challenged the idea of always balancing the budget, by stressing the need to boost effective demand by stimulating consumption.

None of this was close to what Keynes had said in his General Theory. His emphasis was on investment as the motor of the economy; but influential US Keynesians airily dismissed this as a peculiarity of Keynes. Likewise, his efforts to separate capital projects from ordinary budgets, balanced if possible, found few echoes in Washington, despite frequent mention of his name.

Should this surprise us? It does not appear to have disconcerted Keynes. “Practical men were often the slaves of some defunct economist,” he wrote. By the end of the second world war, Lord Keynes of Tilton was no mere academic scribbler but a policymaker, in a debate dominated by second-hand versions of ideas he had put into circulation in a previous life. He was enough of a pragmatist, and opportunist, not to quibble. After dining with a group of Keynesian economists in Washington, in 1944, Keynes commented: “I was the only non-Keynesian there.”

FT.com, In the long run we are all dependent on Keynes

This got me wondering about the theoretical underpinnings of the stimulus prescription. Economists are talking in the language of the IS/LM model, marginal propensity to consume, multipliers for taxes vs. spending, and so forth. But these are all equilibrium shorthand for dynamic concepts. Surely the talk is founded on dynamic models that close loops between money, expectations and the real economy, and contain an operational representation of money creation and lending?
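To illustrate what I mean by equilibrium shorthand, here’s a toy sketch (my own, assuming a marginal propensity to consume of 0.8 and a one-period consumption lag – deliberately not one of “those models”) contrasting the static multiplier with the adjustment path it abbreviates:

```python
# Static Keynesian multiplier vs. the dynamic path it abbreviates (toy numbers).
# C[t] = c0 + mpc * Y[t-1];  Y[t] = C[t] + I + G

mpc = 0.8                  # marginal propensity to consume (assumed)
c0, I, G = 100.0, 200.0, 100.0

static_Y = (c0 + I + G) / (1 - mpc)  # equilibrium shorthand: the 1/(1-mpc) multiplier
print(f"static equilibrium output: {static_Y:.0f}")

Y = 0.0
for t in range(31):        # the adjustment process the shorthand hides
    C = c0 + mpc * Y
    Y = C + I + G
    if t % 5 == 0:
        print(f"t={t:2d}: Y = {Y:7.1f}")

# Y converges toward the static value, but only over many periods -- and only
# because this toy has no capacity constraints, expectations, credit, or money.
```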

The trouble is, after a bit of sniffing around, I’m not seeing those models. On the jacket of Dynamic Macroeconomics, James Tobin wrote in 1997:

“Macrodynamics is a venerable and important tradition, which fifty or sixty years ago engaged the best minds of the economics profession: among them Frisch, Tinbergen, Harrod, Hicks, Samuelson, Goodwin. Recently it has been in danger of being swallowed up by rational expectations, moving equilibrium, and dynamic optimization. We can be grateful to the authors of this book for keeping alive the older tradition, while modernizing it in the light of recent developments in techniques of dynamic modeling.”
—James Tobin, Sterling Professor of Economics Emeritus, Yale University

Is dynamic macroeconomics still moribund, supplanted by CGE models (irrelevant to the problem at hand) and black box econometric methods? Someone please point me to the stochastic behavioral disequilibrium nonlinear dynamic macroeconomics literature I’ve missed, so I can sleep tonight knowing that policy is informed by something more than comparative statics.

In the meantime, the most relevant models I’m aware of are in system dynamics, not economics. An interesting option (which you can read and run) is Nathan Forrester’s thesis, A Dynamic Synthesis of Basic Macroeconomic Theory (1982).

Forrester’s model combines Samuelson’s multiplier-accelerator, Metzler’s inventory-adjustment model, Hicks’ IS/LM, and the aggregate-supply/aggregate-demand model into a 10th order continuous dynamic model. The model generates an endogenous business cycle (4-year period) as well as a longer (24-year) cycle. The business cycle arises from inventory and employment adjustment, while the long cycle involves multiplier-accelerator and capital stock adjustment mechanisms acting on final demand. Forrester used the model to test a variety of countercyclic economic policies, commonly recommended as antidotes for business cycle swings:

Results of the policy tests explain the apparent discrepancy between policy conclusions based on static and dynamic models. The static results are confirmed by the fact that countercyclic demand-management policies do stabilize the demand-driven [long] cycle. The dynamic results are confirmed by the fact that the same countercyclic policies destabilize the business cycle. (pg. 9)

It’s not clear to me what exactly this kind of counterintuitive behavior might imply for our current situation, but it seems like a bad time to inadvertently destabilize the business cycle through misapplication of simpler models.
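For readers who haven’t seen the multiplier-accelerator mechanism that drives the long cycle, here’s a minimal discrete-time sketch of Samuelson’s original formulation (a toy, emphatically not Forrester’s 10th order continuous model; the parameters are chosen only to put it in the oscillatory regime):

```python
# Samuelson multiplier-accelerator model, discrete-time toy version.
# C[t] = mpc*Y[t-1];  I[t] = accel*(C[t] - C[t-1]);  Y[t] = C[t] + I[t] + G
# With mpc*accel = 1 the cycle neither damps nor explodes, so the oscillation
# generated by the multiplier and accelerator alone is easy to see.

mpc, accel, G = 0.8, 1.25, 100.0    # illustrative parameter choices
Y_prev, C_prev = 400.0, 320.0       # arbitrary starting point below equilibrium (Y* = 500)

for t in range(24):
    C = mpc * Y_prev                # consumption responds to last period's income
    I = accel * (C - C_prev)        # investment responds to the *change* in consumption
    Y = C + I + G
    print(f"t={t:2d}: Y = {Y:7.1f}")
    Y_prev, C_prev = Y, C
```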

It’s unclear to what extent the model applies to our current situation, because it doesn’t include budget constraints for agents, and thus doesn’t include explicit money and debt stocks. While there are reasonable justifications for omitting those features for “normal” conditions, I suspect that since the origin of our current troubles is a debt binge, those justifications don’t apply where we are now in the economy’s state space. If so, then the equilibrium conclusions of the IS/LM model and other simple constructs are even more likely to be wrong.

I presume that the feedback structure needed to get your arms around the problem properly is in Jay Forrester’s System Dynamics National Model, but unfortunately it’s not available for experimentation.

John Sterman’s model of The Energy Transition and the Economy (1981) does have money stocks and debt for households and other sectors. It doesn’t have an operational representation of bank reserves, and it monetizes the deficit, but if one were to repurpose the model a bit (by eliminating the depletion issue, among other things) it might provide an interesting compromise between the two Forrester models above.

I still have a hard time believing that macroeconomics hasn’t trodden some of this fertile ground since the 80s, so I hope someone can comment with a more informed perspective. However, until someone disabuses me of the notion, I have the gnawing suspicion that the models are broken and we’re flying blind. Sure hope there aren’t any mountains in this fog.

Another Bailout MetaRoundup

More good stuff from my blog reader:

Real Time Economics – Secondary Sources: Rebutting Myths, King on Stability, Green Econ & Secondary Sources: Rates, Innovation, Recession, Subprime Lending

Greg Mankiw – More Commentary on the Financial Mess

Economist’s View – links for 2008-10-23 & links for 2008-10-22

Marginal Revolution – Four myths of the credit crisis, again

Econbrowser – More Unhappy Numbers

Bailout MetaRoundup & Alternatives

MetaRoundup:

Several of the economics blogs I read have had useful roundups of bailout commentary. A few I found useful:

Do we need to act now? on Economist’s View

9/26 Links on Economist’s View

NYT Economix’ analyst roundup

Greg Mankiw’s roundup of commentary

Update 9/29:

Real Time Economics’ Secondary Sources

Update 10/1: 

Greg Mankiw with more commentary

Alternative Plans:

Economists Against the Paulson Plan

Brad DeLong on Krugman on the Dodd plan

WSJ Real Time Economics’ Text of Lawmakers’ Agreement on Principles

Thomas Palley on Saving the Financial System

Marginal Revolution on the Republican plan to rescue mortgages instead of buying mortgage assets

Marginal Revolution with a Modest Proposal (finding and isolating toxic assets)

Update 9/27:

Marginal Revolution with substitute bridges

Greg Mankiw with a letter from Robert Shimer with a nice analysis, including problems with Paulson, the lemons problem, and the Diamond, Kaplan, Kashyap, Rajan & Thaler fix

Update 9/28:

Real Time Economics on securitization

Brad DeLong on nationalization (the Swedish model)

Update 9/29:

The Big Picture with Stop Targeting Asset Prices

Marginal Revolution asks, is the Sweden plan better?

The GAO’s Panel of Economists on Climate

I just ran across a May 2008 GAO report, detailing the findings of a panel of economists convened to consider US climate policy. The panel used a modified Delphi method, which can be good or evil. The eighteen panelists are fairly neoclassical, with the exception of Richard Howarth, who speaks the language but doesn’t drink the Kool-aid.

First, it’s interesting what the panelists agree on. All of the panelists supported establishing a price on greenhouse gas emissions, and a majority were fairly certain that there would be a net benefit from doing so. A majority also favored immediate action, regardless of the participation of other countries. The favored immediate action is rather fainthearted, though. One-third favored an initial price range under $10/tonCO2, and only three favored exceeding $20/tonCO2. One panelist specified a safety valve price at 55 cents. Maybe the low prices are intended to rise rapidly (or at the interest rate, per Hotelling); otherwise I have a hard time seeing why one would bother with the whole endeavor. It’s quite interesting that panelists generally accept unilateral action, which by itself wouldn’t solve the climate problem. Clearly they are counting on setting an example, with imitation bringing more emissions under control, and perhaps also on first-mover advantages in innovation.
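For reference, a price that rises at the interest rate stays modest for a long time if it starts low; a quick sketch (assuming, for illustration only, a $10/tonCO2 starting point and a 5% rate – neither number comes from the GAO report):

```python
# Hotelling-style carbon price path: the price rises at the rate of interest.
# Starting price and rate are illustrative assumptions, not GAO figures.

p0 = 10.0   # initial price, $/tonCO2 (assumed)
r = 0.05    # annual interest/discount rate (assumed)

for year in (0, 10, 20, 30, 40):
    price = p0 * (1 + r) ** year
    print(f"year {year:2d}: ${price:6.2f}/tonCO2")

# At 5%/yr a $10 price only reaches about $70 after 40 years; starting at the
# 55-cent safety valve, it stays trivial for decades -- hence my quibble above.
```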
