When David Vines asked me to contribute to an OXREP (Oxford Review of Economic Policy) issue on “Rebuilding Macroeconomic Theory”, I think he hoped I would write on how the core macro model needed to change to reflect macroeconomic developments since the crisis, with a particular eye to modelling the impact of fiscal policy. That would be an interesting paper to write, but I decided fairly quickly that I wanted to say something that I thought was much more important.
In my view the
biggest obstacle to the advance of macroeconomics is the hegemony of
microfoundations. I wanted at least one of the papers in the
collection to question this hegemony. It turned out that I was not
alone, and a few papers did the same. I was particularly encouraged
when Olivier Blanchard, in blog posts reflecting his thoughts before
writing his contribution, was thinking along the same lines.
I will talk about
the other papers when more people have had a chance to read them.
Here I will focus on my own contribution. I have been pushing a
similar line in blog posts for some time, and that experience
suggests to me that most macroeconomists working within the hegemony
have a simple mental block when they think about alternative
modelling approaches. Let me see if I can break that block here.
Imagine a DSGE model, ‘estimated’ by Bayesian techniques. To be specific,
suppose it contains a standard intertemporal consumption function.
Now suppose someone adds a term into the model, say unemployment into
the consumption function, and thereby significantly improves the fit
of the model. It is not hard to see why the fit significantly
improves: unemployment could be a proxy for the uncertainty of labour
income, for example. The key question becomes which is the better
model with which to examine macroeconomic policy: the DSGE or the
augmented model?
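The thought experiment can be made concrete with a toy simulation. This is purely illustrative (the variable names and numbers are hypothetical, not taken from any actual DSGE model): if consumption genuinely responds to unemployment, an equation that omits that channel will fit the data noticeably worse than one that includes it.

```python
import numpy as np

# Toy illustration: simulate data where "true" consumption depends on
# income AND unemployment, then compare the fit of a baseline equation
# that omits unemployment with an augmented one that includes it.
rng = np.random.default_rng(0)
n = 200
income = rng.normal(100.0, 10.0, n)   # hypothetical income series
unemp = rng.normal(5.0, 1.0, n)       # hypothetical unemployment rate
cons = 10.0 + 0.8 * income - 2.0 * unemp + rng.normal(0.0, 1.0, n)

def ols_rss(regressors, y):
    """Residual sum of squares from an OLS fit (with intercept)."""
    X = np.column_stack([np.ones(len(y))] + list(regressors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid)

rss_baseline = ols_rss([income], cons)          # income only
rss_augmented = ols_rss([income, unemp], cons)  # income + unemployment

# The augmented model fits markedly better, mirroring the thought
# experiment: omitting a genuine channel worsens the fit.
assert rss_augmented < rss_baseline
```

Of course, better in-sample fit alone does not settle the question, which is exactly the methodological point at issue.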
A microfoundations
macroeconomist will tend to say without doubt the original DSGE
model, because only that model is known to be theoretically
consistent. (They might instead say that only that model satisfies
the Lucas critique, but internal consistency is the more general
concept.) But an equally valid response is to say that the original
DSGE model will give incorrect policy responses because it misses an
important link between unemployment and consumption, and so the
augmented model is preferred.
There is absolutely
nothing that says that internal consistency is more important than
(relative) misspecification. In my experience, when confronted with
this fact, some DSGE modellers resort to two diversionary tactics.
The first, which is to say that all models are misspecified, is not
worthy of discussion. The second is that neither model is
satisfactory, and research is needed to incorporate the unemployment
effect in a consistent way.
I have no problem
with that response in itself, and for that reason I have no problem
with the microfoundations project as one way to do
macroeconomic modelling. But in this particular context it is a
dodge. There will never be, at least in my lifetime, a DSGE model
that cannot be improved by adding plausible but potentially
inconsistent effects like unemployment influencing consumption. Which
means that, if you think models that are significantly better at
fitting the data are to be preferred to the DSGE models from which
they came, then these augmented models will always beat the DSGE model
as a way of modelling policy.
What this question
tells you is that there is an alternative methodology for building
macroeconomic models that is not inferior to the microfoundations
approach. This starts with some theoretical specification, which
could be a DSGE model as in the example, and then extends it in ways
that are theoretically plausible and which also significantly improve
the model’s fit, but which are not formally derived from
microfoundations. I call that an example within the Structural
Econometric Model (SEM) class, and Blanchard calls it a Policy Model.
An important point I
make in my paper is that these are not competing methodologies, but
instead they are complementary. SEMs as I describe them here start
from microfounded theory. (Of course SEMs can also start from
non-microfounded theory, but the pros and cons of that are a different
debate I want to avoid here.) As a finished product they provide many
research agendas for microfoundation modelling. So DSGE modelling can
provide the starting point for builders of SEMs or Policy Models, and
these models when completed provide a research agenda for DSGE
modellers.
Once you see this
complementarity, you can see why I think macroeconomics would develop
much more rapidly if academics were involved in building SEMs as well
as building DSGE models. The mistake the New Classical Counter
Revolution made was to dismiss previous ways of modelling the
economy, instead of augmenting these ways with additional approaches.
Each methodology on its own will develop much more slowly than the
two combined. Another way of putting it is that research based on SEMs is more efficient than the puzzle resolution approach used today.
In the paper, I try
to imagine what would have happened if the microfoundations project
had just augmented the macroeconomics of the time (which was SEM
modelling), rather than dismissing it out of hand. I think we have
good evidence that active complementarity between SEM and microfoundations modelling would have investigated in depth the links between the financial and real sectors before the financial crisis. The microfoundations hegemony chose the wrong puzzles to look at, deflecting macroeconomics from the more important empirical issues. The same thing may happen again if the microfoundations hegemony continues.
Interesting. But this is too accommodating of DSGE models for me. Ultimately models have to be judged against experience, and if they systematically get things wrong then they need to be junked. Models don't have to predict everything, but they should explain the systematic regularities of the set of variables they seek to understand (a choice informed by theory).
The process described by David Hendry or Aris Spanos seems to me to make sense. First you look for a statistically adequate model, which I understand to be one which does not lead to systematic errors, and which may not be the model with the best fit as measured by R2-type measures. This model provides a concise summary of what needs to be explained - it's a way of summarising the data in analytical form, allowing us a clear overview of empirical regularities. Then you use theory to see if you can simplify the model without losing substantial explanatory power. The model is provisional, as all scientific theories must be, and may break down. So we investigate further, and we may find a more general explanation, or a missing variable which we hadn't picked up before because it hadn't varied enough to matter. This seems to me like empirical science.
Starting from the assumption that everything can be explained by individual rational maximizing behaviour is either vacuous (you can always stick in a few more constraints on knowledge or something) or an assumption needing empirical justification. Sometimes it seems like pre-Enlightenment arguments that things must be just so because they follow from God's perfection or Aristotelian inherent capacities. Interested in your views on Hendry.
"These are not competing methodologies, but instead they are complimentary." Your first espresso cupertino of the year?
Wishing your blog many complements and compliments in 2018.
Thanks. At least I got it right on the second attempt.
Thank you for this thought-provoking post. To conscientious people, consistency sounds like a good thing. It implies rigour, discipline, restraint. A bit like a defence against populism.
For economists of a scientific mind, it is irresistibly attractive. Building a theory consistent with foundations makes the resulting edifice feel sound.
But so it did to logicians and mathematicians in the first decades of the twentieth century. Until Kurt Gödel showed that there were limits to founding a consistent system on axioms - truth and provability were not the same thing. Some mathematicians fell to pieces at the result, most moved on and now have a better understanding of what they are trying to do.
Why do people insist on microfoundation consistency in the first place? In spite of what I have just written, logic has little to do with it. They insist on this because it gives the answer they want. It’s consistency with the kind of microfoundations they like, not those of the real world or even an approximation to it. The ones that can be captured by those in power to justify the sort of economy that we have ended up with.
Some of these foundations are right or so nearly right as to be useful. But if we had to explain some of this to the general public, they would be aghast - free disposal of goods, no real explanation of money, insurance or banks, no asymmetric information….
It might be tempting to turn the tables and ask for more rigorous foundations of the microfoundations themselves - based on psychological research or behavioural observations. And why not? But there would still be absolute limits to the pursuit of consistency.
Has economics had its Gödel moment? Maybe not quite. But if it moved in a more approximative and empiricist direction, as you suggest, then it would begin to learn some of the same lessons.
What if someone came up with microfoundations that yielded Keynesian macroeconomic conclusions? Would you still castigate it as microfoundations hegemony? Or are your objections to microfoundations based on the fact that they are mainly used by neoclassicals to come up with anti-Keynesian conclusions?
They did: it's called New Keynesian economics, now the dominant paradigm in business cycle analysis.
If the macroeconomics does not include involuntary unemployment, it can hardly be called Keynesian.
It seems mainstream economics continually wants to look at things in a Classical (or in this case New Classical) framework. Why do you think that is the case? Can you see why some people see this as religion (trying to explain evolution using the framework of Creationism)? Classical economics, at its roots, is fundamentally ideological. Just accept that as it is. Ad hoc frictions added to such a framework are not a proper way of doing analysis. Keynes explained why we get a permanent state of unemployment. In the end he had to throw away the classical framework to do so and looked at the actual reasons why. And this is the way it should be done.
NK.
Interesting - and I think there's a similar relationship between Health Economics and Health Services Research (as well as the biological sciences and epidemiology you mention in a previous blog post).
HSR papers are normally far easier to translate into policy questions, or easier to apply to a policy question. However, from my experience HSR researchers can restrict themselves to modelling the characteristics they can easily observe in data. Translating your thinking to this context, Health Economists working more theoretically could have two benefits: to provide a foundation for HSR models, as you state, but also to push HSR researchers to study characteristics that are theoretically important but might be more difficult to observe and quantify.
Non-economist again. Methodologically, the idea of getting a better "fit" by adding an additional component to the model is fraught with problems. Maybe this was just shorthand in a blog post, but without validation of the model against completely separate data sets it can easily produce garbage.
"An important point I make in my paper is that these are not competing methodologies, but instead they are complementary."
ReplyDeleteAn important point I make in my paper is that these are not competing methodologies; instead, they are complementary.
There. All better.
Perhaps I'm not reading this post correctly, or I am not sufficiently well versed in economics, but it seems to me that the complaint from the microfoundations camp is not that DSGE models are known to be inconsistent in certain boundary cases, but that they haven't been proven to be consistent. If this is the case, doesn't this mean they insist on somewhat more than mathematical rigor? If so, I'd suggest that perhaps the fundamental objection then is not lack of rigor.
Macro for retarded economists
ReplyDelete“Since every act of spending results in income for somebody else, total spending for the economy as a whole equals total income. This is true by definition and is a basic building block in macroeconomics.” (Cooper)
Both orthodox and heterodox economists subscribe to this statement as the self-evident rock-bottom truth of all of economics. Too bad that this statement is materially/logically false.
The foundational error/mistake/blunder consists in the methodological fact that the two most important magnitudes of economics — profit and income — are ill-defined.#1 In order to see this one has to go back to the most elementary configuration, that is, the pure production-consumption economy which consists of the household and the business sector.#2
In this elementary economy, three configurations are logically possible: (i) consumption expenditures are equal to wage income C=Yw, (ii) C is less than Yw, (iii) C is greater than Yw.
• In case (i) the monetary saving of the household sector Sm≡Yw−C is zero and the monetary profit of the business sector Qm≡C−Yw, too, is zero. (The product market is cleared, i.e. X=O, in all three cases.)
• In case (ii) monetary saving Sm is positive and the business sector makes a loss, i.e. Qm is negative.
• In case (iii) monetary saving Sm is negative, i.e. the household sector dissaves, and the business sector makes a profit, i.e. Qm is positive.
It always holds Qm+Sm=0 or Qm=−Sm, in other words, at the heart of the monetary circuit is an identity: the business sector’s deficit (surplus) equals the household sector’s surplus (deficit). Put bluntly, loss is the counterpart of saving and profit is the counterpart of dissaving. This is the most elementary form of the macroeconomic Profit Law. It follows directly from the profit definition and the definition of household sector saving.
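The commenter's bookkeeping can be checked mechanically. A few lines of Python (illustrative numbers only, not part of the original comment) confirm that with Sm ≡ Yw − C and Qm ≡ C − Yw, the identity Qm + Sm = 0 holds in all three cases:

```python
# Illustrative check of the identity above: Sm = Yw - C, Qm = C - Yw,
# hence Qm + Sm = 0 whatever the level of C relative to Yw.
def saving_and_profit(Yw, C):
    Sm = Yw - C   # monetary saving of the household sector
    Qm = C - Yw   # monetary profit of the business sector
    return Sm, Qm

cases = {"(i) C = Yw": (100, 100),    # Sm = 0, Qm = 0
         "(ii) C < Yw": (100, 90),    # Sm > 0, Qm < 0 (loss)
         "(iii) C > Yw": (100, 110)}  # Sm < 0, Qm > 0 (profit)

for label, (Yw, C) in cases.items():
    Sm, Qm = saving_and_profit(Yw, C)
    assert Qm + Sm == 0  # the identity holds in every case
    print(label, "Sm =", Sm, "Qm =", Qm)
```

The identity itself is just arithmetic; whether it carries the theoretical weight the commenter places on it is a separate question.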
Loss or profit are NOT income. Only distributed profit is income. The profit theory is false since Adam Smith.#3 As a collateral damage, all I=S or IS-LM models are false.
Economists are too stupid for the elementary mathematics of accounting.#4 The statement total income equals total spending is simply false because of the all-important phenomenon of credit. Equipped with credit the household sector can spend MORE than its period income (= dissaving) or in the opposite case LESS (= saving). Total spending and total income are NEVER equal, the foundational intuition of macroeconomics is false ― and so is all the rest. Macroeconomics is dead since Keynes.#5
Egmont Kakarot-Handtke
#1 For details see ‘How the Intelligent Non-Economist Can Refute Every Economist Hands Down’
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2705395
and ‘Keynes’ Missing Axioms’ Sec. 14-18
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1841408
#2 The elementary production-consumption economy is given by three macro axioms: (A1) Yw=WL: wage income Yw is equal to wage rate W times working hours L; (A2) O=RL: output O is equal to productivity R times working hours L; (A3) C=PX: consumption expenditure C is equal to price P times quantity bought/sold X.
#3 See ‘Essentials of Constructive Heterodoxy: Profit’
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2575110
and cross-references Profit
http://axecorg.blogspot.de/2015/03/profit-cross-references.html
#4 See ‘The Common Error of Common Sense: An Essential Rectification of the Accounting Approach’
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2124415
#5 How Keynes got macro wrong and Allais got it right
https://axecorg.blogspot.de/2016/09/how-keynes-got-macro-wrong-and-allais.html