Sunday, 29 December 2013

Werning on Liquidity Trap Policy

For macroeconomists

I finally got round to reading this paper by Iván Werning - Managing a Liquidity Trap: Monetary and Fiscal Policy. It takes the canonical New Keynesian model, puts it into continuous time, and looks at optimal monetary and fiscal policy when there is a liquidity trap. (To be precise: a period where real interest rates are above their natural level because nominal interest rates cannot be negative). I would say it clarifies rather than overturns what we already know, but I found some of the clarifications rather interesting. Here are just two.


1) Monetary policy alone. The optimal commitment policy (Krugman/Eggertsson and Woodford [1]) of creating a boom after the liquidity trap period might (or might not) generate a path for inflation where inflation is always above target (taken as zero). Here is a picture from the paper, where the output gap is on the vertical axis and inflation on the horizontal, and we plot the economy through time. The black dots show the economy under optimal discretionary policy, the blue under commitment; in both cases the economy ends up at the bliss point of a zero gap and zero inflation.


In this experiment real interest rates are above their natural level (i.e. the liquidity trap lasts) for T periods, and everything after this shock is known. Under discretionary policy, both output and inflation are too low for as long as the liquidity trap lasts. In this case output starts off 11% below its natural level, and inflation about 5% below. The optimal commitment policy creates a positive output gap after the liquidity trap period (after T). Inflation in the NK Phillips curve is just the discounted integral of future output gaps, so inflation could be positive immediately after the shock: here it happens to be zero. As we move forward in time some of the negative output gaps disappear from the integral, and so inflation rises.
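To see this, note that the continuous-time New Keynesian Phillips curve (in standard notation, with x the output gap, κ the Phillips curve slope and ρ the discount rate) is

$$\dot{\pi}(t) = \rho\,\pi(t) - \kappa\,x(t),$$

which, solved forward, gives inflation as a discounted integral of future output gaps:

$$\pi(t) = \kappa \int_t^{\infty} e^{-\rho(s-t)}\,x(s)\,ds.$$

A sufficiently large positive gap after T can therefore offset the negative gaps during the trap, keeping π(t) at or above zero throughout.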

It makes sense, as Werning suggests, to focus on the output gap. Think of the causality involved, which goes: real rates - output gap (with forward integration) - inflation (with forward integration), which then feeds back on to real rates. Optimal policy must certainly involve an initial negative output gap, followed by a positive output gap, but inflation need not be negative at any point.
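The other link in that chain is the continuous-time Euler (IS) equation, again in standard notation, with i the nominal rate and r^n the natural real rate:

$$\dot{x}(t) = \sigma^{-1}\left(i(t) - \pi(t) - r^n(t)\right).$$

The real rate gap integrates forward into the output gap, the output gap integrates forward into inflation (as above), and inflation then feeds back into the real rate.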

There are other consequences. Although the optimal commitment policy involves creating a positive output gap in the future, which implies keeping real interest rates below their natural level for a period after T, inflation is also higher, so nominal rates could be higher too. As a result, at any point in time the nominal rate on a sufficiently long term bond could also be higher (page 16).
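One way to see the point about long bonds: if we assume the expectations hypothesis (my simplification for illustration; the paper works directly with the path of short rates), the yield on a bond of maturity m averages expected future short rates,

$$i_m(t) = \frac{1}{m}\int_t^{t+m} i(s)\,ds.$$

Commitment lowers i(s) for a period just after T, but via higher inflation it raises nominal short rates further out, so for a long enough maturity the average can be higher than under discretion.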

2) Adding fiscal policy. The paper considers adding government spending as a fiscal instrument. It makes an interesting distinction between ‘opportunistic’ and ‘stimulus’ changes in government spending, but I do not think I need that for what follows, so hopefully that will be the subject of a later post. What I had not taken on board is that the optimal path for government spending might involve a prolonged period where government spending is below its natural level. Here is another picture from the paper.


The blue line is the optimal commitment policy without any fiscal action: the same pattern as in the previous figure. The red line is the path for output and inflation with optimal government spending, and the green line is the path for the consumption gap rather than the output gap in that second case. The vertical difference between red and green is what is happening to government spending.

The first point is that using fiscal policy leads to a distinct improvement. We need much less excess inflation, and the output gap is always smaller. The second is that although government spending is initially above its natural level, it moves below it when the output gap is itself positive, i.e. beyond T. Why is this?

Our initial intuition might be that government spending should just ‘plug the gap’ generated by the liquidity trap, giving us a zero output gap throughout. Then there would be no need for an expansionary monetary policy after the trap - fiscal policy could completely stabilise the economy during the liquidity trap period. This would give us declining government spending, because the gap itself declines. (Even if the real interest rate is too high by a constant amount in the liquidity trap, consumption cumulates this forward.)
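The cumulation point follows from integrating the Euler equation above from t to the end of the trap at T, writing r = i − π for the real rate:

$$x(t) = x(T) - \sigma^{-1}\int_t^{T} \big(r(s) - r^n(s)\big)\,ds.$$

With a constant rate gap δ > 0 and, say, x(T) = 0, this gives x(t) = −σ^{-1}δ(T − t): the consumption shortfall is largest at the start of the trap and shrinks steadily as T approaches, so gap-plugging government spending would decline in step.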

This intuition is not correct partly because using the government spending instrument has costs: we move away from the optimal allocation of public goods. So fiscal policy does not dominate (eliminate the need for) the Krugman/Eggertsson and Woodford monetary policy, and optimal policy will involve a mixture of the two. That in turn means we will still get, under an optimal commitment policy, a period after the liquidity trap when there will be a positive consumption gap.

The benefit of the positive consumption gap after the liquidity trap, and the associated lower real rate, is that it raises consumption during the liquidity trap period compared to what it might otherwise have been. The cost is higher inflation in the post liquidity trap period. But inflation depends on the output gap, not just the consumption gap. So we can improve the trade-off by lowering government spending in the post liquidity trap period.

Two final points on what the paper reaffirms. First, even with the most optimistic (commitment) monetary policy, fiscal policy has an important role in a liquidity trap. Those who still believe that monetary activism is all you need in a liquidity trap must be using a different framework. Second, the gains from trying to implement something like the commitment policy are large. Yet everywhere monetary policy seems to be trying to follow the discretionary rather than the commitment policy: there is no discussion of allowing the output gap to become positive once the liquidity trap is over, and rules that might mimic the commitment policy are off the table. [2] I wonder if macroeconomists in 20 years’ time will look back on this period with the same bewilderment with which we now look back on monetary policy in the early 1930s or 1970s.


[1] Krugman, Paul (1998), “It’s Baaack! Japan’s Slump and the Return of the Liquidity Trap”, Brookings Papers on Economic Activity, 1998(2), 137-187. Eggertsson, Gauti B. and Michael Woodford (2003), “The Zero Bound on Interest Rates and Optimal Monetary Policy”, Brookings Papers on Economic Activity, 34(1), 139-235.

[2] Allowing inflation to rise a little bit above target while the output gap is still negative is quite consistent with following a discretionary policy. I think some people believe that monetary policy in the US might be secretly intending to follow the Krugman/Eggertsson and Woodford strategy, but as the whole point about this strategy is to influence expectations, keeping it secret would be worse than pointless.



Saturday, 28 December 2013

UK Flooding: another austerity Christmas present

The big news in the UK over the Christmas period has been flooding caused by heavy rain. The Prime Minister naturally toured some of the worst affected areas, but the reaction he got, in terms of media coverage, was not what he might have hoped for. Was this hostility fair? Here are some facts. (Source (pdf): Flood defences in England, House of Commons Library, SN/SC/5755.)


Until 2010, flood defence spending by the government had been steadily increasing: between 1997 and 2010 spending increased by 75% in real terms. There are good reasons why spending should be increasing. One is that climate change is likely to substantially increase the chances of periods of severe rainfall. Flood damage currently costs over £1 billion a year, but the Environment Agency has estimated this figure could rise to £27 billion by 2080.

When the current government came to power, their 2010 comprehensive spending review reduced spending by 20% in real terms, according to the Committee on Climate Change. Following floods in 2012 the government provided a small amount of additional money - shown in purple on the chart. So instead of continuing to raise spending to deal with a growing threat, the government cut back spending as part of their austerity programme.

It is a distraction to try and link specific episodes of flooding to spending cutbacks. These things work on probabilities. It is also a distraction to obsess about whether spending has gone up or down in real terms. The government will claim that spending on ‘frontline’ defences has not fallen because of ‘efficiency savings’ elsewhere and partnerships with local authorities, but the real point is this. The recession presented the government with a huge opportunity: to bring forward the many existing plans to enhance the UK’s flood defences at a time when labour was cheap and borrowing costs very low. They chose not to take advantage of that opportunity, ostensibly because of a potential debt crisis but in reality because of an ideological distaste for public spending. Over the next decade or two, many people will pay the price for that decision, either directly as their homes and businesses are flooded, or indirectly through higher insurance premiums.

Postscript - see also this later post

Sunday, 22 December 2013

Some notes on the UK recovery

The latest national accounts data we have is for 2013 Q3. Between 2012Q4 and 2013Q3 real GDP increased by 2.1% (actual, not annual rate). Not a great number, but it represented three consecutive quarters of solid growth, which we had not seen since 2007. So where did this growth come from? The good news is that investment over that same period rose by 4%. (This and all subsequent figures are the actual 2013Q3/2012Q4 percentage growth rate.) Business investment increased (2.7%), public investment barely did (0.5%), but dwellings investment rose by 8%. The bad news is that exports rose by only 0.1%. Government consumption increased by 1.0%.


Over half of the increase in GDP was down to a 1.8% rise in consumption. Not huge, but significant because it represented a large fall in the savings ratio, as this chart shows.
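As a back-of-the-envelope check on the ‘over half’ claim, here is the contributions arithmetic (the expenditure shares are illustrative round numbers I have assumed, not official ONS weights, and imports are ignored):

```python
# Rough contributions to the 2.1% rise in UK real GDP, 2012Q4-2013Q3.
# Contribution (percentage points) = component growth x share of GDP.
# Shares are assumed round numbers for illustration, not ONS weights.

components = {
    # name: (growth %, assumed share of GDP)
    "consumption":            (1.8, 0.65),
    "investment":             (4.0, 0.15),
    "government consumption": (1.0, 0.21),
    "exports":                (0.1, 0.30),
}

for name, (growth, share) in components.items():
    print(f"{name:22s} {growth * share:+.2f}pp")

# consumption comes out at about +1.2pp - over half of the 2.1% rise.
```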


The large increase in saving since 2009 is a major factor behind the recession. The recovery this year is in large part because the savings ratio has begun to fall. We should be cautious here, because data on the savings ratio is notoriously subject to revision. However, if we look at the main component of income, compensation of employees, this rose by 3.4%, while nominal consumption rose by 4.4%, again indicating a reduction in saving.
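A rough consistency check, treating compensation of employees as a proxy for household income (an approximation): since the savings ratio is s = 1 − C/Y,

$$\Delta s \approx -(1-s)\,(g_C - g_Y) \approx -0.95 \times (4.4\% - 3.4\%) \approx -1 \text{ percentage point},$$

so consumption growing one point faster than income knocks roughly a percentage point off the savings ratio.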

So the recovery so far is essentially down to less saving/more borrowing, with a minor contribution from investment in dwellings (house building). As Duncan Weldon suggests, the Funding for Lending scheme may be an important factor here. However, it may also simply mark the end of a balance sheet adjustment, with consumers getting their debts and savings closer to where they want them to be following the financial crash.

At this point I cannot help repeating an observation I have made before. Macro gets blamed for not foreseeing the financial crisis, although I suspect if most macroeconomists had seen this data before the crash they would have become pretty worried. But what macro can certainly be blamed for is not having much clue about the proportion of consumers who are subject to credit constraints, and, for those who are not, what determines their precautionary savings: see this post for more. This is why no one really knew when the savings ratio would start coming down, and no one really knows when this will stop.

Some people have argued that we should be suspicious about this recovery, because it involves consumers saving less and borrowing more. Some of the fears behind this are real. One fear is that, encouraged by Help to Buy, the housing market will see a new bubble, and many people will get burned as a result. Another is that some households will erroneously believe that ultra low interest rates are here forever, and will not be able to cope when they rise. But although these are legitimate concerns, which macroprudential policy should try and tackle, the truth is that one of the key ways that monetary policy expands the economy is by getting people to spend more and save less. So if we want a recovery, and the government does not allow itself fiscal stimulus, and Europe remains depressed because of austerity, this was always going to be how it happens. [1]

However, there is a legitimate point about a recovery driven by a falling savings ratio: the ratio cannot go on falling forever. The moment it stops falling, consumption growth will match income growth. The hope must be that it will continue for long enough to get business investment rising more rapidly, and for the Eurozone to start growing again so that exports can start increasing. But the big unknown remains productivity. So far, the upturn in growth does not seem to have been accompanied by an upturn in productivity. In the short term that is good because it reduces unemployment, but if it continues, real wages will not increase by much, which in turn means that at some point consumption growth will slow.

There is a great set of graphs in this post at Flip Chart Fairy Tales which illustrates the scale of the productivity problem. (Rick - apologies for not discovering your blog earlier.) For example the OBR, in November 2010, were expecting real wages in 2015 to be 10% higher than in their recent Autumn Statement forecast. We will not recover the ground lost as a result of the recession until productivity growth starts exceeding pre-recession averages. As Martin Wolf and I suggest, the Chancellor should be focusing on the reasons for the UK’s productivity slowdown rather than obsessing about the government’s budget deficit.

[1] In theory it could have happened through a large increase in investment. However the experience of the recession itself, and more general evidence, suggests that investment is strongly influenced by output growth. That is why investment has not forged ahead as a result of low interest rates, and why firms continue to say that a shortage of finance is not holding them back. Having said that, I would have preferred the government to try fiscal incentives to bring forward investment rather than implement measures aimed at raising house prices.


Saturday, 21 December 2013

The Conservatives and the ghost of Christmas past

In October 2002 Theresa May, the then Chairman of the Conservative Party, said to her party’s conference: "There's a lot we need to do in this party of ours. Our base is too narrow and so, occasionally, are our sympathies. You know what some people call us – the Nasty Party." That tag owes something to the contrast between the public images of Margaret Thatcher and Tony Blair: the Iron Lady compared to Blair’s easy informality. In terms of policies it is not totally clear that the label was deserved. Poverty increased, but the poor were not denigrated. Unions were broken, but many felt the unions had become too powerful and selfish in their use of power. The state was reduced by privatising utilities, but the welfare state was not seriously diminished. Unemployment rose substantially, but inflation had to be brought under control. But whether deserved or not, I think May was right in her observation.

David Cameron also appears to have believed that the Conservatives had this image problem, and in opposition he aimed to create the idea of a modern compassionate Conservative Party. Hoodies were to be hugged, environmental goals embraced, and most tellingly of all, rather than deny the relevance of ‘society’, he wanted to create a ‘Big Society’. I am not concerned here about how real or radical these changes were, but instead just note that he felt a change of image was necessary to end the Conservatives’ run of election defeats. The fact that they did not win the 2010 election outright perhaps suggests the strength and toxicity of the nasty brand.

What a difference a few years make. As the government finds it more and more difficult to cut government spending on goods and services, it aims austerity at welfare spending. There is plenty that has already happened, some well known, some not. As to the future, here is Paul Johnson of the IFS talking about the implications of the latest Autumn Statement. The scale of the cuts he is talking about for welfare is huge (particularly if state pensions are ring fenced), yet they appear to be Osborne’s preferred option. The Conservatives’ current Party Chairman and an influential MP have recently suggested restricting benefits for those with more than two children, to encourage ‘more responsible’ decisions about procreation. Never mind the impact this would have on those children.

Changes to welfare already introduced, together with falling real wages, have led to a huge rise in the use of food banks in the UK. Here is data from the Trussell Trust, one of the main operators of voluntary food banks. 346,992 people received a minimum of three days’ emergency food from Trussell Trust food banks in 2012-13, compared to 26,000 in 2008-09. Of those helped in 2012-13, 126,889 (36.6 percent) were children. The Red Cross is to start distributing food aid in the UK, for the first time since WWII. A letter from doctors to the British Medical Journal talks about a potential public health emergency involving malnutrition. It is undeniable that benefit changes are a big factor behind these developments, yet the government seems intent on hiding this fact.

Actions are of course more important than rhetoric, but rhetoric can help define image. It is undeniable that ministers, including the Prime Minister and Chancellor, have attempted to portray the poor and unemployed as personally responsible for their position due to some character failure. Even a proud institution like HM Treasury cannot resist being part of this process. (‘Hard-working families’ looks like going the same way as ‘taxpayers’ money’, becoming a routine slight against either the unemployed or the poor.) Both Cameron and Osborne will be too careful to emulate Romney’s 47% moment, but too many Conservative MPs appear to share the attitudes of some of those on the US right.

So what accounts for this U-turn from compassion to disparagement? The recession is one answer, which has hardened social attitudes. The success of UKIP, the political wing of the majority of UK newspapers, is another. [1] Yet it seems incredible that a political calculation that appeared valid before 2010 can have been so completely reversed in just a few years. Even Theresa May, whose speech started this post, has joined in. There are those vans of course, but asking landlords to check the immigration status of tenants is an incredibly stupid and harmful policy. We will see in 2015 whether it pays to be nasty. [2]

Yet even if the strategy works in the short term, and even recognising that politicians often do questionable things to gain votes, this just seems a step too far. It is one thing to create hardship because you believe this is a necessary price to improve the system or reduce its cost. Perhaps you really believe that cutting the top rate of tax at the same time as cutting welfare will benefit everyone eventually. But it is quite another thing to try and deflect any criticism by unjustly blaming those who earn too little, or who are trying to find work. That just seems immoral.

I suspect Cameron as the Compassionate Conservative would have agreed. He would have also noted that, although nastiness might accord with voter sentiments today, at some point in the future voters in more generous times will have no problem forgetting this, and just remembering the Conservatives as the nasty party. As Christmas approaches, this tale from Charles Dickens seems apt. 

[1] For those who are offended by this sentence, let me say this. There are two obvious explanations for the correlation between UKIP’s policies and the views of the Telegraph, Mail and Sun. One involves the causality implied by the sentence and the post that it links to. The other is that newspapers just reflect the concerns of voters. But if the latter is true, why do they (with the odd exception) just reflect the views of voters on the right, rather than those on the left? And why do the mistaken beliefs of voters tend to correlate with the impressions created by these newspapers, as I note here?

[2] Even if it does, I strongly suspect one casualty will be the LibDems. If their leader spoke out as Vince Cable has done, they might just have a chance of not being associated with these policies and attitudes. But he has not, and as a result the party is in serious danger of losing many votes and I suspect much of its activist base. 


Thursday, 19 December 2013

More on the illusion of superiority

For economists, and those interested in methodology

Tony Yates responds to my comment on his post on microfoundations, but really just restates the microfoundations purist position. (Others have joined in - see links below.) As Noah Smith confirms, this is the position that many macroeconomists believe in, and many are taught, so it’s really important to see why it is mistaken. There are three elements I want to focus on here: the Lucas critique, what we mean by theory, and time.

My argument can be put as follows: an ad hoc but data inspired modification to a microfounded model (what I call an eclectic model) can produce a better model than a fully microfounded model. Tony responds “If the objective is to describe the data better, perhaps also to forecast the data better, then what is wrong with this is that you can do better still, and estimate a VAR.” This idea of “describing the data better”, or forecasting, is a distraction, so let’s say I want a model that provides a better guide for policy actions. So I do not want to estimate a VAR. My argument still stands.

But what about the Lucas critique? Surely that says that only a microfounded model can avoid the Lucas critique. Tony says we might not need to worry about the Lucas critique if policy changes are consistent with what policy has done in the past. I do not need this, so let’s make our policy changes radical. My argument still stands. The reason is very simple. A misspecified model can produce bad policy. These misspecification errors may far outweigh any errors due to the Lucas critique. Robert Waldmann is I think making the same point here. (According to Stephen Williamson, even Lucas thinks that the Lucas critique is used as a bludgeon to do away with ideas one doesn't like.)

Stephen thinks that I think the data speaks directly to us. What I think is that the way a good deal of research is actually done involves a constant interaction between data and theory. We observe some correlation in the data and think why that might be. We get some ideas. These ideas are what we might call informal theory. Now the trouble with informal theory is that it may be inconsistent with the rest of the theory in the model - that is why we build microfounded models. But this takes time, and in the meantime, because it is also possible that the informal theory may be roughly OK, I can incorporate it in my eclectic model.[1] In fact we could have a complete model that uses informal theory - what Blanchard and Fischer call useful models. The defining characteristic of microfounded models is not that they use theory, but that the theory they use can be shown to be internally consistent.

Now Tony does end by saying “ad-hoc modifications seem attractive if they are a guess at what a microfounded model would look like, and you are a policymaker who can’t wait, and you find a way to assess the Lucas-Critique errors you might be making.” I have dealt with the last point – it’s perfectly OK to say the Lucas critique may apply to my model, but that is a price worth paying to use more evidence than a microfounded model does to better guide policy. For the sake of argument let’s also assume that one day we will be able to build a microfounded model that is consistent with this evidence. (As Noah says, I’m far too deferential, but I want to persuade rather than win arguments.) [2] In that case, if I’m a policy maker who cannot wait for this to happen, Tony will allow me my eclectic model.

This is where time comes in. Tony’s position is that policymakers in a hurry can do this eclectic stuff, but we academics should just focus on building better microfoundations. There are two problems with this. First, building better microfoundations can take a very long time. Second, there is a great deal that academics can say using eclectic, or useful, models.

The most obvious example of this is Keynesian business cycle theory. Go back to the 1970s. The majority of microfoundations modellers at that time, New Classical economists, said price rigidity should not be in macromodels because it was not microfounded. I think Tony, if he had been writing then, would have been a little more charitable: policymakers could put ad hoc price rigidities into models if they must, but academics should just use models without such rigidities until those rigidities could be microfounded.

This example shows us clearly why eclectic models (in this case with ad hoc price rigidities) can be a far superior guide for policy than the best microfounded models available at the time. Suppose policymakers in the 1970s, working within a fixed exchange rate regime, wanted to devalue their currency because they felt it had become overvalued after a temporary burst of domestic inflation. Those using microfounded models would have said there was no point - any change in the nominal exchange rate would be immediately offset by a change in domestic prices. (Actually they would probably have asked how the exchange rate can be overvalued in the first place.) Those using eclectic models with ad hoc price rigidities would have known better. Would those eclectic models have got things exactly right? Almost certainly not, but they would have said something useful, and pointed policy in the right direction.

Should academic macroeconomists in the 1970s have left these policymakers to their own devices, and instead got on with developing New Keynesian theory? In my view some should have worked away at New Keynesian theory, because it has improved our understanding a lot, but this took a decade or two to become accepted. (Acceptance that, alas, remains incomplete.) But in the meantime they could also have done lots of useful work with the eclectic models that incorporated price stickiness, such as working out what policies should accompany the devaluation. Which of course in reality they did: microfoundations hegemony was less complete in those days.

Today I think the situation is rather different. Nearly all the young academic macroeconomists I know want to work with DSGE models, because that is what gets published. They are very reluctant to add what might be regarded as ad hoc elements to these models, however strong the evidence and informal theory that might support such a modification. They are also understandably unclear about what counts as ad hoc and what does not. The situation in central banks is not so very different.

This is a shame. The idea that the only proper way to do macro that involves theory is to work with fully microfounded DSGE models is simply wrong. I think it can distort policy, and can hold back innovation. If our DSGE models were pretty good descriptions of the world then this misconception might not matter too much, but the real world keeps reminding us that they are not. We really should be more broad minded. 

[1] Suppose there is some correlation in the past that appears to have no plausible informal theory that might explain it. Including that in our eclectic model would be more problematic, for reasons Nick Rowe gives.


[2] I suggest why this might not be the case here. Nick Rowe discusses one key problem, while comments on my earlier post discuss others.

Tuesday, 17 December 2013

Osborne’s Plan B

Three months ago, when I was complaining about the economic illiteracy shown in the Financial Times leader entitled “Osborne wins the battle on austerity”, I focused on the general point that austerity meant delaying the recovery, not preventing it ever happening. However there was another sense in which that leader, and all the similar comments made by many others, was wrong. Numbers from the latest OBR forecast allow me to give more detail on how Plan A was in fact put on hold, and the recovery we have had has followed a suspension of austerity. (Another reason for doing this is that some of the headline numbers are distorted by special factors, which the OBR corrects for now, but which may get lost later.)

To quote from that FT leader: “Since the austerity policy was still under way, claims that a “Plan B” was necessary to trigger a recovery had been proved wrong, [Osborne] argued. Mr Osborne has won the political argument.” From which you would gather that we are still following Plan A, which involved steadily reducing borrowing by, among other things, reducing the share of government spending in GDP.

So here are the numbers.

Osborne’s Plan B

                       2009   2010   2011   2012   2013   2014   2015   2016   2017
Deficit/GDP            11.0    9.3    7.9    7.8    7.5    6.5    5.5    3.7    2.3
Cyclically adjusted     8.7    7.1    6.0    5.9    5.1    4.0    3.2    1.7    0.7
G Con growth            0.7    0.5   -0.1    2.6    0.4   -0.7   -0.4   -1.0   -1.8
G Con/GDP              23.2   22.7   22.0   21.8   21.0   20.5   19.7   18.8   17.7

Notes. The first two rows are for financial years, the second two for calendar years. Row 1: Deficit/GDP is PSNB/GDP as a %, adjusted for Royal Mail and APF transfers. Row 2 is the same, cyclically adjusted. The source for both is the OBR December Autumn Statement forecast, Table 4.34, and their historical database. Row 3: percentage growth of real government consumption, source OBR Table 1.1 and ONS. Row 4: source OBR Chart 3.29.

We start with the deficit as a percentage of GDP. This was very high in 2009, partly because of the recession, but also to stimulate the economy. Austerity began in 2010, and continued in 2011, with sharp falls in the deficit. But in 2012 the deficit was much the same as it was in 2011. There is then a (forecast) modest fall in 2013, followed by a projected return to austerity from 2014 onwards. We get a similar pattern if we cyclically adjust these numbers (using OBR estimates of the output gap).
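For those who want the mechanics, here is a stylised sketch of cyclical adjustment (not the OBR’s actual method, which is more elaborate; the 0.5 sensitivity is an assumed round number):

```python
# Stylised cyclical adjustment: strip the estimated cyclical part of the
# deficit out of the headline figure. With a negative output gap part of
# the headline deficit is cyclical, so the adjusted figure is smaller.
# The 0.5 sensitivity is an assumed round number, not the OBR's parameter.

SENSITIVITY = 0.5  # assumed response of deficit/GDP to a 1% output gap

def cyclically_adjusted(deficit_ratio: float, output_gap: float) -> float:
    """Both arguments in % of GDP; the output gap is negative in a slump."""
    return deficit_ratio + SENSITIVITY * output_gap

# Example: the 2012 column of the table, with the output gap backed out
# from these assumed numbers for illustration:
print(cyclically_adjusted(7.8, -3.8))  # 5.9, matching the adjusted row
```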

An alternative summary statistic for judging the fiscal stance is government consumption of real goods and services. Although this tells only half the fiscal story, because it ignores taxes and transfers, many people will be able to smooth the income effects of tax changes through saving. Government consumption, however, feeds straight through into demand. Real government consumption showed little growth in 2010, fell slightly in 2011, but increased by over 2.5% in 2012. These numbers depend on the government consumption deflator, which may not be very well measured, so the final row shows the share of government consumption in GDP. The pattern is similar.

So there you have it. Plan A was temporarily abandoned. Austerity stalled. Was that important in boosting the recovery that followed in 2013? We cannot know for sure, but that is not the key issue here. The important point was that Plan A was clearly put on hold. Claims that the government stuck to Plan A are false. The reason Plan A was abandoned, of course, was that it was delaying the recovery, and the government needed a recovery before the next election.


Nothing I have said here is particularly new – Jonathan Portes made the same point in September, for example. Of course the right wing media (which the FT shows signs of wanting to join) ignore these facts. The interesting question is whether others in the media will follow their lead. They did so with the myth of Labour profligacy – but that was based on a half-truth. They continue to treat the economic impact of austerity as controversial (despite the OBR and European Commission numbers discussed here and here), but there will always be some economists who make it appear so. But in this case the numbers are actual data, and their implication is absolutely clear. So we will see if the myth that Osborne stuck to Plan A survives, and we really do have to live with a post-truth media. 

Monday, 16 December 2013

Microfoundations - the illusion of necessary superiority

Tony Yates has written a long discussion of microfounded macro, which I think reflects the views of many academic macroeconomists. I agree with him that microfoundations have done a great deal to advance macroeconomics. It is a progressive research program, and a natural way for macroeconomic theory to develop. That is why I work with DSGE models. Yet in one respect I think he is mistaken, which is in his presumption that microfounded models are naturally superior as a tool for policy actions.

Let me try and paraphrase his discussion. There are basically two ways to model the economy. The first is using microfounded models, which when done right avoid the Lucas critique. The second is to do something like a VAR, which lets the data speak, although doing policy with a VAR is problematic.

So what about aggregate models that use a bit of theory and a bit of econometrics? Let me quote.

“A final possibility is that there is no alternative but to proceed in non-micro-founded way. Yet some business has to be done – some policy decision, or some investment based on a forecast. In these circumstances, it’s ok to take a stab at the decision rules or laws of motions for aggregates in an economy might look like if you could micro-found what you are concerned with, and move on. Perhaps doing so will shed light on how to do it properly. Or at least give you some insight into how to set policy. Actually many so called microfounded models probably only have this status; guesses at what something would look like if only you could do it properly.”

As the language makes clear, we are talking at least second best here. Tony would not go so far as to outlaw non-microfounded models, but any such models are clearly inferior to doing “it properly”.

Yet what is the basis of this claim? A model should be as good a representation of the economy as possible for the task in hand. The modeller has two sources of information to help them: micro theory about how individuals may behave, and statistical evidence about how the aggregate economy has behaved in the past. Ideally we would want to exhaust both sets of information in building our model, but our modelling abilities are just not good enough. There is a lot in the data that we cannot explain using micro theory.

Given this, we have three alternatives. We can focus on microfoundations. We can focus on the data. Or we can do something in between - let me call this the eclectic approach. We can have an aggregate model where equation specification owes something to theory, but also attempts to track more of the data than any microfounded model would try to do. I can see absolutely no reason why taking this eclectic approach should produce a worse representation of the economy than the other two, whether your business is policy or forecasting.

Let’s take a very basic example. Suppose in the real world some consumers are credit constrained, while others are infinitely lived intertemporal optimisers. A microfoundation modeller assumes that all consumers are the latter. An eclectic modeller, on finding that consumption shows excess sensitivity to changes in current income, adds a term in current income into their aggregate consumption function, which otherwise follows the microfoundations specification. Which specification will perform better? We cannot know for sure, but I see circumstances in which the ‘ad hoc’ eclectic specification would do better than the misspecified microfounded model. (I give a more sophisticated example related to Friedman’s PIH and precautionary saving here.)
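One minimal way to write such an eclectic specification down - a sketch in the spirit of Campbell and Mankiw’s rule-of-thumb consumers, in my notation rather than an equation from anyone’s paper - is to let a weight λ on current income capture the excess sensitivity:

$$c_t = \lambda\, y_t + (1-\lambda)\left[E_t c_{t+1} - \sigma\,(i_t - E_t \pi_{t+1})\right].$$

With λ = 0 this collapses to the pure microfounded Euler equation; estimating λ > 0 from the data is exactly the kind of ad hoc but data inspired modification at issue.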

Now a microfoundation modeller might respond that the right thing to do in these circumstances is to microfound these credit constrained consumers. But that just misses the point. We are not talking about research programmes, but particular models at particular points in time. At any particular time, even the best available microfounded model will be misspecified, and an eclectic approach that uses information provided by the data alongside some theory may pick up these misspecifications, and therefore do better.

Another response might be that we know for sure that the eclectic model will be wrong, because (for example) it will fail the Lucas critique. More generally, it will not be internally consistent. But we also know that the microfounded model will be wrong, because it will not have the right microfoundations. The eclectic model may be subject to the Lucas critique, but it may also - by taking more account of the data than the microfounded model - avoid some of the specification errors of the microfounded model. There is no way of knowing which errors matter more.

It’s easy to see why eclectic models have a hard time. Because they look at both theory and data, they will never satisfy both theorists and econometricians. But that does not make them necessarily inferior to either microfounded models or VARs. We can speculate on reasons why, on at least some occasions, eclectic models may do better. But the key point I want to make here is that I do not know of any epistemological reasons for thinking eclectic models must be inferior to microfounded models, yet many macroeconomists seem to believe that they are.


Sunday, 15 December 2013

Inequality and the Left

In the debate over inequality and priorities set off by Ezra Klein’s article, Kathleen Geier writes (HT MT) “the policy fixes for economic inequality are fairly clear: in no particular order, they include a higher minimum wage, stronger labor unions, a more progressive tax system, a more generous social welfare state, macroeconomic policies that promote a full employment economy, and much more powerful government regulations, particularly in the banking and finance sector.” And part of me thought, do we really want to go back to the 1970s?

Maybe this is being unfair for two reasons. First, in terms of the strength of unions, or the progressivity of taxes, the 1970s in the UK was rather different from the 1970s in the US. Second, perhaps all we are talking about here is swinging the pendulum back a little way, and not all the way to where it was before Reagan and Thatcher. Yet perhaps my reaction explains why inequality is hardly discussed in public by the mainstream political parties – at least in the UK.

The 1997-2010 Labour government was very active in attempting to reduce poverty (with some success), but was “intensely relaxed about people getting filthy rich as long as they pay their taxes.” This was not a whim but a strategy. It wanted to distance itself from what it called ‘Old Labour’, which was associated in particular with the trade unions.  Policies that were explicitly aimed at greater equality were too close to Old Labour [1], but policies that tackled poverty commanded more widespread support. Another way of saying the same thing was that Thatcherism was defined by its hostility to the unions, and its reduction of the top rates of income tax, rather than its hostility to the welfare state. 

I think these points are important if we want to address an apparent paradox. As this video illustrates (here is the equivalent for the US), growing inequality is not popular. Fairness is up there with liberty as a universally agreed goal, and most people do not regard the current distribution of income as fair. In addition, evidence that inequality is associated with many other ills is becoming stronger by the day. Yet the UK opposition today retains the previous government’s reluctance to campaign on the subject.

This paradox appears all the more perplexing after the financial crisis, for two reasons. First the financial crisis exploded the idea that high pay was always justified in terms of the contribution those being paid were making to society. High paid bankers are one of the most unpopular groups in society right now, and it would be quite easy to argue that these bankers have encouraged other business leaders to pay themselves more than they deserve. Second, while Thatcherism did not attempt to roll back the welfare state, austerity has meant that the political right has chosen to paint poverty as laziness. As a result, reducing poverty is no longer an uncontroversial goal. [2]

What is the answer to this paradox? Why is tackling inequality not seen as a vote winner on the mainstream left? I can think of two possible answers, but I’m not confident about either. One, picking up from the historical experience I discussed above, is that reducing inequality is still connected in many minds with increasing the power of trade unions, and this is a turn-off for voters. A second is that it is not popular opinion that matters directly, but instead the opinion of sections of the media and business community that are not forever bound to the political right. Politicians on the left may believe that they need some support from both sectors if they are to win elections. Policies that reduce poverty, or reduce unemployment, do not directly threaten these groups, while policies that might reduce the incomes of the top 10% do.

This leads me to one last argument, which extends a point made by Paul Krugman. I agree with him that “we know how to fight unemployment — not perfectly, but good old basic macroeconomics has worked very well since 2008.... The causes of soaring inequality, on the other hand, are more mysterious; so are the channels through which we might reverse this trend. We know some things, but there is much more room for new knowledge here than in business cycle macro.” My extension would be as follows. The main reasons why governments have failed to deal with unemployment are accidental rather than intrinsic: the best instrument available in a liquidity trap (additional government spending) conflicts with the desire of those on the right to see a smaller state. (Those who oppose all forms of stimulus are still a minority.) In contrast, reversing inequality directly threatens the interests of most of those who wield political influence, so it is much less clear how you overcome this political hurdle to reverse the growth in inequality.

[1] This association is of course encouraged by the political right, which is quick to brand any attempt at redistribution as 'class war'. 

[2] The financial crisis did allow the Labour government to create a new top rate of income tax equal to 50%, but this was justified on the basis that the rich were more able to shoulder the burden of reducing the budget deficit, rather than that they were earning too much in the first place.  

Friday, 13 December 2013

The UK recovery and the pessimists’ refrain

In two recent posts I plotted UK GDP per person since 1950. What is remarkable to me about that time series is how well a simple trend tracks the data – until the current recession. In reality trend growth rates have probably moved around a bit, but the important point is that past recessions have essentially turned out to be temporary deviations from trend growth, rather than signalling fundamental shifts. In particular, following both the major recessions of the early 1980s and 1990s, we achieved recoveries that brought us back to something close to the level of output we could have achieved if there had been no recession.

There are two arguments that this time will be different: what I will call the pessimists’ refrain. The first is that we were fooling ourselves before the recession, because in reality we were ‘living beyond our means’. This argument suggests that output in 2007 was unsustainably high, so our trend line should be lower and flatter. The second is that since 2008/9 productivity has stalled because the financial system in the UK has been broken. I call these ‘refrains’, because they usually come with a repeated message: we should stop stimulating the economy, because if we try to get back the ground we lost, we will fail and instead generate inflation. (A third argument focuses on hysteresis effects of high unemployment, but this is more important for the US than the UK.)

Larry Summers has recently reminded us that there was something odd about the pre-recession period in a number of countries including the US and UK. Although inflation was reasonably stable, we had the kind of housing bubbles that we normally associate with booms. Now Summers argued that this indicated an underlying demand weakness: we could only get people to buy all the stuff we could produce by encouraging them to take out an unusual amount of debt. There is an alternative explanation, which is that - despite appearances - there was a boom before the recession. We were, in 2007, living beyond our means. 

This idea is not the view of a small minority. Organisations like the OECD and IMF now calculate that in 2007 output gaps were large and positive. The latest OECD Economic Outlook gives 3.3% for the OECD area as a whole, 3.5% for the Euro area, 2.9% for the US and 4.4% for the UK. That is not what these organisations were saying at the time. In the June 2008 Economic Outlook, the equivalent numbers were 0.4%, 0.0%, 0.4% and 0.2%. At the time it looked like output in 2007 was close to the natural rate in many countries, including the UK. [1]

This change of view on output gaps, where 2007 goes from balance to a significant boom, is largely inevitable given the way the OECD and IMF calculate these numbers. Since the recession productivity in many countries has been much lower than we might have expected (in the UK it’s the ‘productivity puzzle’), which seems to indicate a fall in how much technical progress has been embodied in production. Traditionally we have thought of technical progress changing gradually, and as being largely unrelated to the economic cycle. If productivity is low today and technical progress only changes gradually, then it follows that some of this slackening in the pace of technical progress must have started before the recession. So in 2007 the economy was not capable of producing as much as we thought at the time.
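Here is a stylised illustration of that mechanism (entirely made-up numbers; the OECD and IMF use production function methods rather than the simple linear trend below, but the logic is similar):

```python
# Why smooth potential-output estimates re-date the boom: if output falls
# and then stays low after 2008, a smooth trend re-fitted through the
# whole sample is dragged down before the crisis too, so the 2007 output
# gap turns positive ex post. All numbers are made up for illustration.
import numpy as np

years = np.arange(2000, 2014)
# log output: 2.5% trend growth to 2007, then a 6% level drop and 1% growth
log_y = np.where(years <= 2007,
                 0.025 * (years - 2000),
                 0.025 * 7 - 0.06 + 0.01 * (years - 2008))

pre = years <= 2007
trend_2007 = np.polyfit(years[pre], log_y[pre], 1)  # the real-time view
trend_full = np.polyfit(years, log_y, 1)            # the ex-post view

print(f"2007 gap, real time: {100 * (log_y[7] - np.polyval(trend_2007, 2007)):+.1f}%")
print(f"2007 gap, ex post:   {100 * (log_y[7] - np.polyval(trend_full, 2007)):+.1f}%")
# real time: roughly zero; ex post: clearly positive
```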

What hard evidence do we have for all this? The living beyond our means case really comes down to the idea that both the housing bubble and the build-up of personal debt must imply there was a boom. Yet this is really one piece of evidence rather than two. As Ben Broadbent shows, the build-up of debt was matched by an increase in assets: the value of houses. (The same appears to be true in the US.) This increase in personal sector indebtedness might have been foolish given what happened to house prices, but the underlying problem was the housing bubble.

Was the housing bubble an indication of a ‘hidden boom’ in 2007? There is a quite plausible alternative explanation, which is that you can get housing booms when real interest rates are low. Many economists, including Ben Bernanke, pointed out before the recession the unusually low returns on long term assets like government debt, an idea that became known as the ‘savings glut’. The return on assets was being driven down because consumers in China and its neighbours were saving extraordinary amounts, leading to large current account surpluses there. When returns on safe assets like government debt are low, people look elsewhere for higher returns, and this includes the housing market. This is probably why we are seeing rapid increases in house prices today in a number of major cities (e.g. London and Paris) and in Germany.

Not only is there this alternative explanation for the housing bubble, but the living beyond our means case cannot satisfactorily account for why - if we had a huge boom - inflation remained so subdued. What upward movement there was could easily be explained by large increases in oil and commodity prices. It is sometimes argued that inflation targeting, or cheap goods from China, kept a lid on consumer price inflation, but there was no indication of any overheating in the labour market either.

The second refrain, much more specific to the UK, focuses on the banking sector. The argument is not that the financial sector grew too rapidly before the recession, and must inevitably shrink back to a more normal size. That is almost certainly true, but the numbers just do not seem large enough. As Martin Wolf points out, the financial sector went from 6% of GDP in 1998 to 9% in 2008. Even if it returns to 6%, for the economy as a whole much of that should be recoverable, because most of those who have lost their jobs in the financial sector are highly skilled and so can be redeployed in other high productivity activities.

The argument is instead that, for the supply side of the economy to grow, more productive firms need to replace inefficient competitors, and new innovative start-ups should challenge existing firms. Both processes almost certainly require borrowing, and in the UK in particular that borrowing usually comes from a few large banks. If these banks stop lending, productivity growth will fall away. This argument is plausible, and can help explain two puzzles about the current recession. The first is why, when UK firms are asked how much spare capacity they have, they respond that they have very little. This is not consistent with the recession being only about lack of demand. The second is that inflation has not been falling more rapidly. If firms cannot get the finance to grow, there is little point in trying to expand their market by cutting prices.

What is frustrating about this idea is that there is little hard evidence either way. Bank lending to firms has certainly fallen, but how much of this is down to banks, and how much is simply that firms think it is too risky to borrow? (This study suggests at least some of the former.) There is a great deal of anecdotal evidence that some banks in the last few years might have been hindering rather than helping small businesses, but translating this into actual numbers for productivity lost is almost impossible. The most compelling argument that something like this has been going on is that other explanations for the UK’s productivity puzzle are either implausible or inadequate in terms of scale.

Perhaps the key question, though, is how permanent all this is. If bank lending starts to recover, can we get back the productivity we have lost? We can look at past financial crises in other countries, as Nick Oulton has done. (His paper also provides useful detail on why other explanations for the productivity puzzle do not seem to work. The section on fiscal policy, however, departs from his usual high standards, as this article in Pieria suggests.) His analysis indicates some permanent hit to GDP from a financial crisis, but the larger numbers come from Latin America. The example of Sweden in the early 1990s is much more optimistic. A priori we might expect a good deal of the innovation to have been ‘put on hold’ because of lack of finance, and this could be activated once banks’ balance sheets are repaired. But perhaps some opportunities may have been lost forever.

Given these uncertainties, two implications for policy seem clear. First, we should stimulate demand until there are clear signs of overheating in both the goods and labour markets. We should only do otherwise if the pessimistic case is compelling, and it is not.

Second, to the extent that we do not recover the ground that we lost in the recession, the costs of the financial crisis are even larger than we thought. This suggests we must do something to make the economy less dependent on the behaviour of a small number of large UK banks. What is interesting about at least some of those who sing the pessimists’ refrain is that they seem to treat this implied loss of UK capacity like an act of God: not only is it something we can do nothing to reverse, but there is little point in investigating it further. That is a strange attitude to take, particularly given that there has been no equivalent productivity puzzle in the country where the banking crisis really started. (This may be another reason to praise small US banks: see Felix Salmon here.) Some who are pessimistic preach the usual neoliberal message for enhancing growth, including lower taxes on high incomes, but seem strangely uninterested in what has caused this alleged huge reduction in supply. Perhaps they fear the answers to the UK’s productivity puzzle will not be to their liking.

[1] For the CBO’s assessment of US potential, see Menzie Chinn.