

Monday, 16 December 2013

Microfoundations - the illusion of necessary superiority

Tony Yates has written a long discussion of microfounded macro, which I think reflects the views of many academic macroeconomists. I agree with him that microfoundations have done a great deal to advance macroeconomics. It is a progressive research programme, and a natural way for macroeconomic theory to develop. That is why I work with DSGE models. Yet in one respect I think he is mistaken, namely in his presumption that microfounded models are naturally superior as a tool for policy.

Let me try and paraphrase his discussion. There are basically two ways to model the economy. The first is using microfounded models, which when done right avoid the Lucas critique. The second is to do something like a VAR, which lets the data speak, although doing policy with a VAR is problematic.
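
In case the term is unfamiliar: a reduced-form VAR (vector autoregression) simply regresses a vector of macro aggregates $x_t$ on its own lags, with no restrictions imposed beyond the choice of variables and lag length $p$:

$$x_t = c + A_1 x_{t-1} + \dots + A_p x_{t-p} + \varepsilon_t$$

The coefficient matrices $A_i$ are estimated freely from the data, which is what 'letting the data speak' amounts to; part of the reason doing policy with a VAR is problematic is that nothing in the estimated $A_i$ tells you how they would shift if the policy regime changed.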

So what about aggregate models that use a bit of theory and a bit of econometrics? Let me quote.

“A final possibility is that there is no alternative but to proceed in a non-micro-founded way. Yet some business has to be done – some policy decision, or some investment based on a forecast. In these circumstances, it’s ok to take a stab at what the decision rules or laws of motion for aggregates in an economy might look like if you could micro-found what you are concerned with, and move on. Perhaps doing so will shed light on how to do it properly. Or at least give you some insight into how to set policy. Actually many so called microfounded models probably only have this status; guesses at what something would look like if only you could do it properly.”

As the language makes clear, we are talking at least second best here. Tony would not go so far as to outlaw non-microfounded models, but any such models are clearly inferior to doing “it properly”.

Yet what is the basis of this claim? A model should be as good a representation of the economy as possible for the task in hand. The modeller has two sources of information to help them: micro theory about how individuals may behave, and statistical evidence about how the aggregate economy has behaved in the past. Ideally we would want to exhaust both sets of information in building our model, but our modelling abilities are just not good enough. There is a lot in the data that we cannot explain using micro theory.

Given this, we have three alternatives. We can focus on microfoundations. We can focus on the data. Or we can do something in between - let me call this the eclectic approach. We can have an aggregate model where equation specification owes something to theory, but also attempts to track more of the data than any microfounded model would try to do. I can see absolutely no reason why taking this eclectic approach should produce a worse representation of the economy than the other two, whether your business is policy or forecasting.

Let’s take a very basic example. Suppose in the real world some consumers are credit constrained, while others are infinitely lived intertemporal optimisers. A microfoundation modeller assumes that all consumers are the latter. An eclectic modeller, on finding that consumption shows excess sensitivity to changes in current income, adds a term in current income into their aggregate consumption function, which otherwise follows the microfoundations specification. Which specification will perform better? We cannot know for sure, but I see circumstances in which the ‘ad hoc’ eclectic specification would do better than the misspecified microfounded model. (I give a more sophisticated example related to Friedman’s PIH and precautionary saving here.)
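
To see how the eclectic modeller might detect this in practice, here is a minimal numerical sketch (my own illustration, with invented parameter values, in the spirit of Campbell and Mankiw's excess-sensitivity regressions rather than anything in Tony's post). A share of households consume their current income, the rest behave like permanent-income consumers, and an IV regression of aggregate consumption growth on income growth recovers a positive excess-sensitivity coefficient that the pure microfounded specification says should be zero.

```python
# Hypothetical simulation: a share `lam` of households are hand-to-mouth
# (consumption tracks current income); the rest are permanent-income (PIH)
# consumers whose consumption responds only to income *news*.
import numpy as np

rng = np.random.default_rng(0)
T, lam, rho = 50_000, 0.3, 0.6   # periods, hand-to-mouth share, persistence of income growth

# Income growth is partly predictable (an AR(1)), so the PIH has testable content.
news = rng.normal(size=T)
dy = np.zeros(T)
for t in range(1, T):
    dy[t] = rho * dy[t - 1] + news[t]

dc_pih = news                            # PIH consumers respond to news only (response scale normalised to one)
dc_htm = dy                              # hand-to-mouth consumers track current income
dc = (1 - lam) * dc_pih + lam * dc_htm   # aggregate consumption growth

# The eclectic modeller's check: regress dc on dy, instrumenting dy with its own
# lag, so that only the *predictable* part of income growth is used.
z, x, y = dy[:-1], dy[1:], dc[1:]
beta_iv = np.cov(y, z)[0, 1] / np.cov(x, z)[0, 1]
print(f"excess-sensitivity coefficient: {beta_iv:.2f} (hand-to-mouth share used: {lam})")
```

Under the pure microfounded specification (everyone an unconstrained intertemporal optimiser) this coefficient should be zero; finding something like 0.3 is exactly the sort of evidence that would lead the eclectic modeller to bolt a current-income term onto the otherwise microfounded consumption function.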

Now a microfoundation modeller might respond that the right thing to do in these circumstances is to microfound these credit constrained consumers. But that just misses the point. We are not talking about research programmes, but particular models at particular points in time. At any particular time, even the best available microfounded model will be misspecified, and an eclectic approach that uses information provided by the data alongside some theory may pick up these misspecifications, and therefore do better.

Another response might be that we know for sure that the eclectic model will be wrong, because (for example) it will fail the Lucas critique. More generally, it will not be internally consistent. But we also know that the microfounded model will be wrong, because it will not have the right microfoundations. The eclectic model may be subject to the Lucas critique, but it may also - by taking more account of the data than the microfounded model - avoid some of the specification errors of the microfounded model. There is no way of knowing which errors matter more.

It’s easy to see why eclectic models have a hard time. Because they look at both theory and data, they will never satisfy both theorists and econometricians. But that does not make them necessarily inferior to either microfounded models or VARs. We can speculate on reasons why, on at least some occasions, eclectic models may do better. But the key point I want to make here is that I do not know of any epistemological reasons for thinking eclectic models must be inferior to microfounded models, yet many macroeconomists seem to believe that they are.


27 comments:

  1. Coming from the world of physics, I think it is patently absurd that there are people who think that "microfounded" models are inherently superior.

    In physics, we have the benefit of knowing the "microfoundations" to an absurd degree of accuracy (quantum electrodynamics is known to be accurate to some 14 significant figures, for instance). But there are many active areas of research in physics where it is fundamentally impossible to apply these "microfoundations" in their full glory: the computational requirements are just too great.

    And so actual physicists routinely make use of a combination of approximations and ad-hoc fudge factors to sort of paste over the deficiencies in being unable to model the underlying physics. If you tried to tell a physicist that modeling a superconductor using no quantum mechanics was superior to using fudge-factors and approximations of the quantum-mechanical behavior, because the description without using quantum mechanics can actually be computed, they'd look at you like you were a crazy person. And rightly so: It's not possible to produce superconducting behavior at all without taking into account quantum effects.

    So when I hear about economists who think that a microfounded model is necessarily better, I cannot help but think that they are insane. They're attempting to model an exceedingly complex system, one which is known to play by at least slightly different rules than those of any known microfoundations. Those models are, by their very nature, approximations. And it's only sensible to use the data to try to correct for the inevitable errors in those approximations, even if it's not clear from our current understanding of the microfoundations just what those fudge factors represent.

    I'm certain that a more ad-hoc approach can be done poorly. But using microfoundations and refusing to use ad-hoc corrections is certainly going to produce an incorrect result in some situations (sometimes wildly incorrect). Much better to make use of a combination of theory and data to try to edge towards the correct answer, and to make the fact that the true microfoundations are actually unknown absolutely central to your modeling, so as to avoid becoming overconfident in the model's applicability.

    Replies
    1. Jason,

      well, maybe if physicists thought a bit more carefully about what economists are trying to do, as opposed to what physicists do, they'd understand the motivation for microfoundations a little better.

      read this:
      http://worthwhile.typepad.com/worthwhile_canadian_initi/2013/12/microfoundations-for-peoples-sake.html

    2. @Luis

      Is Nick really talking about microfoundations? Or is he talking about mechanisms?

      He says microfoundations, but I think he really wants a clear description of the underlying mechanism that is moving the system to or away from an equilibrium. The lack of mechanisms is an endemic problem for DSGEs. I recall Matt Rognlie describing Smets-Wouters (US) as (from memory) "more an arbitrary decomposition into shocks of dubious interpretation, than a model."

      Unfortunately, this description would be true for virtually every DSGE model.

    3. To reinforce Jason Dick's point - Boyle's law - the phenomenological law relating pressure and volume of a gas - was stated in 1662. It took almost 100 years for the correct micro-foundations to be proposed, and almost another 100 years for them to be accepted. The progress of physics would be very slow indeed if we waited for micro-foundational explanations.
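
      Spelling the analogy out: Boyle's empirical law for a fixed quantity of gas at fixed temperature is simply

      $$pV = \text{constant}$$

      and the kinetic-theory microfoundation that eventually arrived underneath it is

      $$pV = N k_B T$$

      with $N$ the number of molecules and $k_B$ Boltzmann's constant. The phenomenological law did useful work for roughly two centuries before anyone could derive it from the behaviour of the molecules.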

  2. I echo Jason Dick's comment above. One thing that stood out here was this:

    "The modeller has two sources of information to help them: micro theory about how individuals may behave"

    Reductionism aside, this seems to be a major problem with microfoundations as I see them. We have theories which posit that people's behaviour can be modeled sufficiently well by sets of equations. But do these equations really represent the 'micro-behaviour' of economic agents, or are they in many ways just as arbitrary as ad-hoc "fudges"?

    Furthermore, why are these relationships themselves not subject to the Lucas Critique? Is it really something we can ever truly overcome, or do we just constantly need to look out for the changing relationships between policy, theory and reality?

  3. SW-L:
    "At any particular time, even the best available microfounded model will be misspecified, and an eclectic approach that uses information provided by the data alongside some theory may pick up these misspecifications, and therefore do better."

    "But the key point I want to make here is that I do not know of any epistemological reasons for thinking eclectic models must be inferior to microfounded models, yet many macroeconomists seem to believe that they are."

    Well put.
    I'll also point out that many such economists are also committed to axiomatic theorising and are disdainful of inductive reasoning. They also insist on restricting allowable data to almost nothing other than time-series of P and Q, so if your eclectic approach requires some other form of data, it will be disallowed.

  4. It is my opinion that both eclectic models and microfounded models, as currently formulated, are always prone to failure because they do not adequately account for the huge inertia that is associated with most disturbances to the economy. The time between the cause of the input disturbance (e.g. an interest rate change or a tax rate change) and its final effect on, say, unemployment is a matter of many months and sometimes years. If the modelling of these (many) inertial effects is approached in the correct manner, as is done for example in the model featured on the website www.economyuk.co.uk, then a great deal of the fog currently enveloping macro-economic modelling simply disperses.

  5. "But we also know that the microfounded model will be wrong, because it will not have the right microfoundations. The eclectic model may be subject to the Lucas critique, but it may also - by taking more account of the data than the microfounded model - avoid some of the specification errors of the microfounded model. There is no way of knowing which errors matter more."

    YES. YES. A THOUSAND TIMES YES. YESTHOUSAND.

    Replies
    1. this point could also be made about empirically informed "verbal reasoning" versus maths. Tony addresses Adam Posen, who I presume uses microfoundations in the sense Nick Rowe uses here, just not with maths.

      It could be that maths is a straitjacket and that less formal reasoning is more flexible. Yes, maths helps avoid certain varieties of errors, but perhaps it introduces another kind of error by limiting what we are able to embody in maths, and again it's not obvious which errors matter more.

      personally, these thoughts do not make me want to abandon microfoundations+maths, merely to keep an open mind about other approaches.

    2. I don't think anyone is talking about doing away with maths. It seems to me to be a call for broadening the mathematical toolkit and complementing formal models with less formal methods of reasoning. And in empirical work, moving beyond the statistical properties of time series to also allow other types of data. Bob Solow has been calling for this for ages:

      "No one could be against time-series econometrics. When we need estimates of parameters, for prediction or policy analysis, there is no good alternative to the specification and estimation of a model. To leave it at that, however, to believe as many American economists do that empirical economics begins and ends with time series analysis, is to ignore a lot of valuable information that can not be put into so convenient a form. I include the sort of information that is encapsulated in the qualitative inferences made by expert observers, as well as direct knowledge of the functioning of economic institutions. Skepticism is always in order, of course. Insiders are sometimes the slaves of silly ideas. But we are not so well off for evidence that we can afford to ignore everything but time series of prices and quantities."
      ----Robert Solow (Nobel Lecture)

      An example from Lucas on the importance of more eclectic modelling and data. Note Lucas's *reason* for dismissing the possibility of a recession in 2007:

      Robert Lucas (Sept 2007):
      "I am skeptical about the argument that the subprime mortgage problem will contaminate the whole mortgage market, that housing construction will come to a halt, and that the economy will slip into a recession. Every step in this chain is questionable and none has been quantified."

    3. Herman,

      maybe so, but I was channelling Adam Posen, whose claim that maths + microfounded models have "no merit" is, I think, a call to abandon them.

    4. "I don't think anyone is talking about doing away with maths."

      I think we can do away with a hell of a lot of it. For sure the mathematicisation of the discipline is right there at the centre of the problem.

  6. David Dreyer Lassen, 17 December 2013 at 08:40

    SW-L: "Let’s take a very basic example. Suppose in the real world some consumers are credit constrained, while others are infinitely lived intertemporal optimisers. A microfoundation modeller assumes that all consumers are the latter. An eclectic modeller, on finding that consumption shows excess sensitivity to changes in current income, adds a term in current income into their aggregate consumption function, which otherwise follows the microfoundations specification."

    Why would a microfoundation modeller necessarily assume that all consumers are infinitely lived intertemporal optimizers? (and even if they are, such consumers can presumably also be credit constrained, since life spans and objective functions tell us nothing about the credit market). There are plenty of models out there with credit/liquidity constraints. For a simple example of a macro model with such heterogeneity, see Mankiw's AER P&P where he combines life cycle people and hand-to-mouth consumers. For empirical evidence on such heterogeneity, see Kreiner, Lassen and Leth-Petersen, CEPR DP 2012.

    Replies
    1. I think you miss the point here. This was an example of how, by looking at the data, we can make ad hoc corrections to a microfounded model that can improve that model. Now of course macroeconomists will then want to microfound those improvements, but this is a never ending process. At any particular time, we can take the best (but misspecified) microfounded model, and potentially improve it by making ad hoc modifications suggested by the data. Discovering how to build a better microfounded model may take a long time.

      If you are still not convinced, look at the Carroll paper I discuss in the penultimate paragraph of this post: http://mainlymacro.blogspot.co.uk/2013/05/data-theory-and-central-bank-models.html. We are only just beginning to microfound precautionary saving, but Carroll suggests Friedman's PIH captured the key points decades ago.

  7. I have a question about the statement: "There are basically two ways to model the economy. The first is using microfounded models, which when done right avoid the Lucas critique. The second is to do something like a VAR, which lets the data speak...."

    How are you planning on microfounding changing preferences over time?

    Putting aside whether people are rational in their decisions, I see a gigantic catch-all category at the foundation of microeconomic theory that we are assuming never moves or moves in an incredibly simplistic way. I guess I just don't see why a theory grounded on such shaky foundations would necessarily be better than all other theories.

  8. Robert Lucas (2003): “[the] central problem of depression-prevention has been solved, for all practical purposes, and has in fact been solved for many decades.”

    Micro is looking very Hobbesian in its form of rationality.

  9. Read Ronald Coase on microeconomics. Is this what you think you should micro-found on?

    "This separation of economics from the working economy has severely damaged both the business community and the academic discipline."

    http://ineteconomics.org/blog/institute/saving-economics-economists-tribute-late-ronald-coase

    Time to forget microfounding, and start looking outside the economics profession for answers about how people, societies and their institutions actually behave.

  10. Maybe the right question is how closely the models predict reality and how robust the predictions are.

    If any of them performed well, it wouldn't matter if they were based on spherical frictionless cows or epicycles.

    Instead, we have people arguing that QE causes deflation, hyperinflation, and neither, and since none of the models work very well, they can choose their preferred model because it supports their value system and political beliefs, and spend their time debating how many angels can dance on the head of a pin.

  11. Look at option (and similar) valuations or the whole Beta stuff.
    It is clear that sometimes the past gives a very good representation, but other times it differs.
    There is no single better system; it depends on the situation.

    Anyway there are a lot of other possibilities.
    Cut macro into smaller parts, for instance.

    History usually works when the situation is more or less normal. No extraordinary funny stuff. Which is one of the main issues/questions now.

  12. Given that the economy is a collection of individual units I can see the need for an explanation of macroeconomic systems to be founded in micro behaviour, but I have never heard a respectable explanation for why that needs to be expressed in equations. Equations make economics hard to write and tedious to follow, not least because different authors may use different symbols for any given variable, and they severely limit the range of people who can evaluate their reasonableness, especially when it comes to the individuals actually making the economic decisions. If the resulting model yields no analytic solution, reverting to some numerical solution (like log-linearising around the steady state) must squander an unknown amount of the economic intuition represented in the basic equations. Equation-based explanations divert teaching into mathematical tricks that add nothing to the students' understanding. And equations are no more rigorous than logical mechanistic description - as I used to say to my academic colleagues, if you were on trial for a crime you did not commit, you would hope that the process was very rigorous, but your heart would sink if your lawyer presented an argument in equations as to why you had no motive to commit the crime.

    As far as I can see, the role for equations is when the parameters can be known well enough to draw quantitative conclusions.

    I suspect that economics has got itself stuck in this wasteful method of enquiry by an emperor's new clothes dynamic - to reject it is to declare yourself incapable such that your opinion on the matter is to be discounted.

  13. Response up:

    http://noahpinionblog.blogspot.com/2013/12/i-love-microfoundations-just-not-yours.html

  14. I am not at all an expert on this. But there seem to be two very large problems with the micro-foundations idea.

    First, the assumption is that the micro-foundations of economics are generally correct. That is, we can credibly model the behaviour of individuals as utility maximisers in a deterministic manner, in all relevant circumstances. I no longer believe this.

    Second, the assumption is that we can aggregate from the individual "atoms" to the behaviour of the entire complex economic system. But, as the physicist above argues, that is impossible for complex physical systems. Why should it be possible for equally complex economic ones?

    These are not debating points. I would really like to know the answers.

    Replies
    1. On the first point, I agree that microfoundations in macro very often means rather basic (econ 101) micro, and in that sense it is rather old fashioned. In other words we do not see enough of behavioural economics, such as in Akerlof and Shiller's Animal Spirits for example. Is that what you had in mind?

      On the second, I am less convinced that this is a major problem. I cannot think of many well established aggregate macro relationships which cannot in principle be explained by thinking about micro behaviour. An exception might be short term interactions in financial markets, where network analysis along the lines that Haldane has been working on looks very interesting. However, with advances in computing power, we are beginning to see more analysis of very heterogeneous systems, and it will be interesting if these start generating alternative explanations of aggregate relationships that cannot be explained by simpler, representative agent set-ups. They have not yet.

      However, I included both of your doubts, and more, in this post here (http://mainlymacro.blogspot.co.uk/2012/08/arguments-for-ending-microfoundations.html), which argued that it was a mistake to put all our eggs in the microfoundations basket. In the 1970s and 80s there was a rich tradition of empirical macro modelling in the UK, of which Ken Wallis's ESRC Macromodelling Bureau was a key part, but that was killed off by the more fashionable DSGE approach. I think this was a shame, because I think both approaches to macro are worth pursuing.

    2. On the first point, yes, that is what I had in mind. To me, the biggest point on microfoundations is that people simply cannot compute the macroeconomy: it is too profoundly uncertain and indeed incomprehensible. This, of course, is why Keynes introduced the notion of "animal spirits" to which you refer. Thus, the notion of utility maximising choice in this setting just makes no sense to me.

      On the second point, the financial sector is a very big exception to the representative agent formulation, since it drives just about everything in the modern economy.

  15. Your "eclectic approach" is known as "Data-Based Mechanistic" modelling in hydrology (although the theory is quite general). I recommend the papers by Peter Young.

