The Map is Not the Territory: An Essay on the State of Economics
04 October 2011, Institute for New Economic Thinking
The original article, along with responses from other economists, is available at the Institute for New Economic Thinking.
The reputation of economics and economists, never high, has been a victim of the crash of 2008. The Queen was hardly alone in asking why no one had predicted it. An even more serious criticism is that the economic policy debate that followed seems only to replay the similar debate after 1929. The issue is budgetary austerity versus fiscal stimulus, and the positions of the protagonists are entirely predictable from their previous political allegiances.
The doyen of modern macroeconomics, Robert Lucas, responded to the Queen’s question in a guest article in The Economist in August 2009.[1] The crisis was not predicted, he explained, because economic theory predicts that such events cannot be predicted. Faced with such a response, a wise sovereign will seek counsel elsewhere.
But not from the principal associates of Lucas, who are even less apologetic. Edward Prescott, like Lucas, a Nobel Prize winner, began a recent address to a gathering of Laureates by announcing ‘this is a great time in aggregate economics’. Thomas Sargent, whose role in developing Lucas’s ideas has been decisive, is more robust still.[2] Sargent observes that criticisms such as Her Majesty’s ‘reflect either woeful ignorance or intentional disregard of what modern macroeconomics is about’. ‘Off with his head’, perhaps. But before dismissing such responses as ridiculous, consider why these economists thought them appropriate.
In his lecture on the award of the Nobel Prize for Economics in 1995,[3] Lucas described his seminal model. That model developed into the dominant approach to macroeconomics today, now called dynamic stochastic general equilibrium. In that paper, Lucas makes (among others) the following assumptions: everyone lives for two periods, of equal length, working in one and spending in the other; there is only one good, and no possibility of storage of that good, or of investment; there is only one homogeneous kind of labour; there is no mechanism of family support between older and younger generations. And so on.
All science uses unrealistic simplifying assumptions. Physicists describe motion on frictionless planes, gravity in a world without air resistance. Not because anyone believes that the world is frictionless and airless, but because it is too difficult to study everything at once. A simplifying model eliminates confounding factors and focuses on a particular issue of interest. To put such models to practical use, you must be willing to bring back the excluded factors. You will probably find that this modification will be important for some problems, and not others – air resistance makes a big difference to a falling feather but not to a falling cannonball.
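The feather and the cannonball can even be checked on the back of an envelope. The sketch below computes terminal velocity under quadratic drag, using rough, invented parameters; the point is the size of the correction, not the precision of the physics.

```python
import math

# Terminal velocity under quadratic drag: v = sqrt(2*m*g / (rho*Cd*A)).
# Masses and areas are rough guesses, for illustration only.
def terminal_velocity(mass_kg, area_m2, drag_coeff=1.0, rho=1.2, g=9.81):
    return math.sqrt(2 * mass_kg * g / (rho * drag_coeff * area_m2))

print(f"feather:    ~{terminal_velocity(0.005, 0.01):.1f} m/s")  # ~3 m/s: drag dominates at once
print(f"cannonball: ~{terminal_velocity(6.0, 0.02):.0f} m/s")    # ~70 m/s: far above any short fall
```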
But Lucas and those who follow him were plainly engaged in a very different exercise, as the philosopher Nancy Cartwright has explained.[4] The distinguishing characteristic of their approach is that the list of unrealistic simplifying assumptions is extremely long. Lucas was explicit about his objective[5] – ‘the construction of a mechanical artificial world populated by interacting robots that economics typically studies’. An economic theory, he explains, is something that ‘can be put on a computer and run’. Lucas has called structures like these ‘analogue economies’, because they are, in a sense, complete economic systems. They loosely resemble the world, but a world so pared down that everything about them is either known, or can be made up. Such models are akin to Tolkien’s Middle Earth, or a computer game like Grand Theft Auto.
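To make the flavour of an ‘analogue economy’ concrete, here is a deliberately crude sketch, in Python, of a world with some of the features listed above: two-period lives, one good, no storage. Every rule and number is invented for illustration; this is not Lucas’s model, merely a toy that ‘can be put on a computer and run’.

```python
# A toy "analogue economy": agents work when young, spend when old;
# one good, no storage, no family support.  All values are invented.
def run_toy_economy(periods=5, money_stock=100.0, labour_supply=50.0):
    for t in range(periods):
        output = labour_supply            # one homogeneous kind of labour, one good
        price = money_stock / output      # the old spend their entire money holdings
        print(f"period {t}: output={output:.0f}, price={price:.2f}")
        # the young now hold the money stock, to spend when old next period

run_toy_economy()
```

Everything about this little world is either known or made up, which is precisely the point.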
The knowledge that every problem has an answer, even and perhaps especially if that answer may be difficult to find, meets a deeply felt human need. For that reason, many people become obsessive about artificial worlds, such as computer games, in which they can see the connection between actions and outcomes. Many economists who pursue these approaches are similarly asocial. It is probably no accident that economics is by far the most male of the social sciences.
One might learn skills or acquire useful ideas through playing these games, and some users do. If the compilers are good at their job, as of course they are, the sound effects, events, and outcomes of a computer game resemble those we hear and see – they can, in a phrase that Lucas and his colleagues have popularised, be calibrated against the real world. But that correspondence does not, in any other sense, validate the model. The nature of such self-contained systems is that successful strategies are the product of the assumptions made by the authors. It obviously cannot be inferred that policies that work in Grand Theft Auto are appropriate policies for governments and businesses.
Yet this correspondence does seem to be what the proponents of this approach hope to achieve – and even claim they have achieved. The debate on austerity versus stimulus, in academic circles, is in large part a debate about the validity of a property called Ricardian equivalence, which is observed in this type of model. If government engages in fiscal stimulus by spending more or by reducing taxes, people will realise that such a policy means higher taxes or lower spending in future. Even if they seem to be better off today, they will be poorer in future, and by a similar amount. Anticipating this, they will cut back and government spending will crowd out private spending. Fiscal policy is therefore ineffective as a means of responding to economic dislocation.
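The logic can be set out in two-period arithmetic. The numbers below are invented, and real models dress the argument in far more elaborate clothing, but the offset at the heart of Ricardian equivalence is this simple:

```python
# A debt-financed tax cut of 100 today must be repaid, with interest,
# in higher taxes tomorrow.  A household that discounts the future at
# the government's borrowing rate sees no change in lifetime wealth.
r = 0.05                                   # assumed interest rate
tax_cut_today = 100.0
tax_rise_tomorrow = tax_cut_today * (1 + r)

delta_wealth = tax_cut_today - tax_rise_tomorrow / (1 + r)
print(f"change in lifetime wealth: {delta_wealth:.2f}")  # 0.00, so nothing extra is spent
```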
In a more extended defence of the DSGE approach, John Cochrane, Lucas’s Chicago colleague, puts forward the policy ineffectiveness thesis – immediately acknowledging that the assumptions that give rise to it ‘are, as usual, obviously not true’.[6] For most people, that might seem to be the end of the matter. But it isn’t. Cochrane goes on to say that ‘if you want to understand the effects of government spending, you have to specify why the assumptions leading to Ricardian equivalence are false’. That is a reasonable demand, though one that is easy to satisfy – as Cochrane himself readily acknowledges.
But Cochrane will not give up so easily. He goes on: ‘economists have spent a generation tossing and turning the Ricardian equivalence theory and assessing the likely effects of fiscal stimulus in its light, generalising the “ifs” and figuring out the likely “therefores”. This is exactly the right way to do things’. The programme Cochrane describes modifies the core model in a rather mechanical way that makes it more complex, but not necessarily more realistic, by introducing additional parameters that have labels such as ‘frictions’ or ‘transactions costs’ – in much the same way as a game compiler might introduce a new module or sound effect.
Why is this ‘exactly the right way to do things’? There are at least two alternative ways to proceed. You could build a different analogue economy. Joe Stiglitz, for example, favours a model that retains many of Lucas’s assumptions, but gives critical importance to imperfections of information.[7] After all, Ricardian equivalence requires that households have a great deal of information about future budgetary options, or at least behave as if they did. A more radical modification might be an agent-based model, for example, which assumes households respond routinely to events according to specific behavioural rules. Such models can also ‘be put on a computer and run’. It is not obvious in advance – or, generally, in retrospect – whether the assumptions, or conclusions, of these models are more, or less, plausible than those of the kind of model favoured by Lucas and Cochrane.
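For concreteness, here is a minimal sketch of such an agent-based alternative: households follow an invented rule of thumb, spending a fixed fraction of current income, and aggregate demand feeds back into next period’s incomes. Every parameter is an illustrative assumption, yet even rules this trivial generate aggregate dynamics.

```python
import random

random.seed(0)

class Household:
    """A household following a behavioural rule of thumb, not optimisation."""
    def __init__(self):
        self.income = 100.0
        self.propensity = random.uniform(0.6, 0.9)  # invented spending rule

    def spend(self):
        return self.propensity * self.income

households = [Household() for _ in range(1000)]

for t in range(5):
    demand = sum(h.spend() for h in households)
    for h in households:              # crude feedback: incomes are shares of demand
        h.income = demand / len(households)
    print(f"period {t}: aggregate demand = {demand:,.0f}")
```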
But another approach would discard altogether the idea that the economic world can be described by a universally applicable model in which all key relationships are predetermined. Economic behaviour is influenced by technologies and cultures, which evolve in ways that are certainly not random but which cannot be described fully, or perhaps at all, by the kinds of variables and equations with which economists are familiar. Models, when employed, must therefore be context specific, in the manner suggested in a recent book by Roman Frydman and Michael Goldberg.[8]
In that eclectic world, Ricardian equivalence is no more than a suggestive hypothesis. It is possible that some such effect exists. One might be sceptical about whether it is very large, and suspect its size depends on a range of confounding and contingent factors – the nature of the stimulus, the overall political situation, the nature of financial markets and welfare systems. This is what the generation of economists who followed Keynes did when they estimated a consumption function – they tried to measure how much of a fiscal stimulus was spent – and the ‘multiplier’ that resulted.
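That exercise can be reconstructed in a few lines. The sketch below fits a consumption function C = a + bY to invented data by least squares and reads off the implied multiplier 1/(1 - b); the data are made up, and the specification is the simplest imaginable.

```python
# Estimate C = a + b*Y by least squares on invented data, then read
# off the implied Keynesian multiplier 1/(1 - b).
incomes     = [100, 110, 120, 130, 140, 150]   # national income Y (invented)
consumption = [ 82,  90,  97, 105, 112, 120]   # observed consumption C (invented)

n = len(incomes)
mean_y = sum(incomes) / n
mean_c = sum(consumption) / n

b = (sum((y - mean_y) * (c - mean_c) for y, c in zip(incomes, consumption))
     / sum((y - mean_y) ** 2 for y in incomes))   # marginal propensity to consume
a = mean_c - b * mean_y

print(f"MPC = {b:.2f}, implied multiplier = {1 / (1 - b):.1f}")
```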
But you would not nowadays be able to publish similar articles in a good economics journal. You would be told that your model was theoretically inadequate – it lacked rigour, failed to demonstrate consistency. You might be accused of the cardinal sin of being ‘ad hoc’. Rigour and consistency are the two most powerful words in economics today.
They have undeniable virtues, but for economists they have particular interpretations. Consistency means that any statement about the world must be made in the light of a comprehensive descriptive theory of the world. Rigour means that the only valid claims are logical deductions from specified assumptions. Consistency is therefore an invitation to ideology, rigour an invitation to mathematics. This curious combination of ideology and mathematics is the hallmark of what is often called ‘freshwater economics’ – the name reflecting the proximity of Chicago, and other centres such as Minneapolis and Rochester, to the Great Lakes.
Consistency and rigour are features of a deductive approach, which draws conclusions from a group of axioms – and whose empirical relevance depends entirely on the universal validity of the axioms. The only descriptions that fully meet the requirements of consistency and rigour are complete artificial worlds, like those of Grand Theft Auto, which can ‘be put on a computer and run’.
For many people, deductive reasoning is the mark of science, while induction – in which the argument is derived from the subject matter – is the characteristic method of history or literary criticism. But this is an artificial, exaggerated distinction. ‘The first siren of beauty’, says Cochrane, ‘is logical consistency’. It seems impossible that anyone acquainted with great human achievements – whether in the arts, the humanities or the sciences – could really believe that the first siren of beauty is consistency. This is not how Shakespeare, Mozart or Picasso – or Newton or Darwin – approached their task.
The issue is therefore not mathematics versus poetry. Deductive reasoning of any kind necessarily draws on mathematics and formal logic; inductive reasoning is based on experience and above all on careful observation and may, or may not, make use of statistics and mathematics. Much scientific progress has been inductive: empirical regularities are observed in advance of any clear understanding of the mechanisms that give rise to them. This is true even of hard sciences such as physics, and more true of applied disciplines such as medicine or engineering. Economists who assert that the only valid prescriptions in economic policy are logical deductions from complete axiomatic systems nevertheless take prescriptions from doctors who often know little more about the medicines they prescribe than that they appear to treat the disease. Such physicians are unashamedly ad hoc; perhaps pragmatic is a better word. With exquisite irony, Lucas holds a chair named for John Dewey, the theorist of American pragmatism.
Engineers and doctors can perhaps be criticised for attaching too much weight to their own experience and personal observations. They are often sceptical, not just of theory, but of data they have not themselves collected. In contrast, most modern economists make no personal observations at all. Empirical work in economics, of which there is a great deal, predominantly consists of the statistical analysis of large data sets compiled by other people.
Few modern economists would, for example, monitor the behaviour of Procter and Gamble, assemble data on the market for steel, or observe the behaviour of traders. The modern economist is the clinician with no patients, the engineer with no projects. And since these economists do not appear to engage with the issues that confront real businesses and actual households, the clients do not come.
There are, nevertheless, many well paid jobs for economists outside academia. Not, any more, in industrial and commercial companies, which have mostly decided economists are of no use to them. Business economists work in financial institutions, which principally use them to entertain their clients at lunch or advertise their banks in fillers on CNBC. Economic consulting employs economists who write lobbying documents addressed to other economists in government or regulatory agencies.
The mutual disdain between economists and practical people is not a result of practical people not being interested in economic issues – they are obsessed with them. Frustrated, they base their macroeconomic views on rudimentary inductive reasoning, as in the attempts to find elementary patterns in data – will the recession be V-shaped, or L-shaped, or double dip? Freakonomics,[9] which applies simple analytic thinking to everyday problems, has been a best seller for years. Elegantly labelled ideas that resonate with recent experience – the Minsky moment, the tipping point,[10] the Black Swan[11] – are enthusiastically absorbed into popular discourse.
If much of the modern research agenda of the economics profession is thus unconnected to the everyday world of business and finance, this is also largely true of what is taught to students. Most people finishing an undergraduate course today would not be equipped to read the Financial Times. They could import data on GDP and consumer prices into a statistical package, and would have done so, but would have no idea how these numbers were derived. They would be little better equipped than the person in the street to answer questions such as ‘why were nationalised industries more efficient in France than in Britain?’, ‘why is a school teacher in Switzerland paid much more than an Indian one?’, or the oldest of examination chestnuts, ‘are cinema seats in London expensive because rents in London are high, or vice versa?’.
In a much mocked defence of his recent graduate school education, Kartik Athreya explains – with approval – that ‘much of my first year (PhD) homework involved writing down tedious definitions of internally consistent outcomes. Not analysing them, just defining them’.[12] Many subjects involve tedious rote acquisition of essential basic knowledge – think law or medicine – but can it really be right that the essence of advanced economic training is checking definitions of consistency?
A review of economics education two decades ago concluded that students should be taught ‘to think like economists’. But ‘thinking like an economist’ has come to be interpreted as the application of deductive reasoning based on a particular set of axioms. Another Chicago Nobel Prize winner, Gary Becker, offered the following definition: ‘the combined assumptions of maximising behaviour, market equilibrium, and stable preferences, used relentlessly and consistently form the heart of the economic approach’.[13] Becker’s Nobel citation rewards him for ‘having extended the domain of microeconomic analysis to a wide range of economic behavior.’ But such extension is not an end in itself: its value can lie only in new insights into that behaviour.
‘The economic approach’ as described by Becker is not, in itself, absurd. What is absurd is the claim to exclusivity he makes for it: a priori deduction from a particular set of unrealistic simplifying assumptions is not just a tool but ‘the heart of the economic approach’. A demand for universality is added to the requirements of consistency and rigour. Believing that economics is like what they suppose physics to be – not necessarily correctly – economists like Becker regard a valid scientific theory as a representation of the truth – a description of the world that is independent of time, place, context, or the observer. That is what Prescott has in mind in insisting on the term ‘aggregate economics’ instead of macroeconomics – there is, he explains, only economics.
The further demand for universality, combined with the consistency assumption, leads to the hypothesis of rational expectations and a range of arguments grouped under the rubric of ‘the Lucas critique’. If there were such a universal model of the economic world, economic agents would have to behave as if they had knowledge of it, or at least as much knowledge of it as was available, otherwise their optimising behaviour would be inconsistent with the predictions of the model. This is a reductio ad absurdum argument, which demonstrates the impossibility of any universal model – since the implications of the conclusion for everyday behaviour are preposterous, the assumption of model universality is false.
But this is not how the argument has been interpreted. Since the followers of this approach believe strongly in the premise – to deny that there is a single pre-specified model that determines the evolution of economic series would, as they see it, be to deny that there could be a science of economics – they accept the conclusion that expectations are formed by a process consistent with general knowledge of that model. It is by no means the first time that people blinded by faith or ideology have pursued false premises to absurd conclusions – and, like their religious and political predecessors, come to believe that those who disagree are driven by ‘woeful ignorance or intentional disregard’.
This is not science, however, but its opposite. Properly conducted science is always provisional, and open to revision in the light of new data or experience: but much of modern macroeconomics tortures data to demonstrate consistency with an a priori world view or elaborates the definition of rationality to render it consistent with any observed behaviour.
The fallacy here is well described by Donald Davidson:
‘It is perhaps natural to think there is a unique way of describing things which gets at their essential nature, ‘an interpretation of the world which gets it right’, a description of “Reality As It Is In Itself”. Of course there is no such unique “interpretation” or description, not even in the one or more languages each of us commands, not in any possible language. Or perhaps we should just say this is an idea of which no-one has made good sense.’ [14]
And economists have not made good sense of it either, though they have been persistent in trying.
Economic models are no more, or less, than potentially illuminating abstractions. Another philosopher, Alfred Korzybski, puts the issue more briefly: ‘the map is not the territory’.[15] Economics is not a technique in search of problems but a set of problems in need of solution. Such problems are varied and the solutions will inevitably be eclectic.
This is true for analysis of the financial market crisis of 2008. Lucas’s assertion that ‘no one could have predicted it’ contains an important, though partial, insight. There can be no objective basis for a prediction of the kind ‘Lehman Bros will go into liquidation on September 15’, because if there were, people would act on that expectation and, most likely, Lehman would go into liquidation straight away. The economic world, far more than the physical world, is influenced by our beliefs about it.
Such thinking leads, as Lucas explains, directly to the efficient market hypothesis – available knowledge is already incorporated in the price of securities. And there is a substantial amount of truth in this – the growth prospects of Apple and Google, the problems of Greece and the Eurozone, are all reflected in the prices of shares, bonds and currencies. The efficient market hypothesis is an illuminating idea, but it is not “Reality As It Is In Itself”. Information is reflected in prices, but not necessarily accurately, or completely. There are wide differences in understanding and belief, and different perceptions of a future that can be at best dimly perceived.
In his Economist response, Lucas acknowledges that ‘exceptions and anomalies’ to the efficient market hypothesis have been discovered, ‘but for the purposes of macroeconomic analyses and forecasts they are too small to matter’. But how could anyone know, in advance not just of this crisis but also of any future crisis, that exceptions and anomalies to the efficient market hypothesis are ‘too small to matter’?
You can learn a great deal about deviations from the efficient market hypothesis, and the role they played in the recent financial crisis, from journalistic descriptions by people like Michael Lewis[16] and Greg Zuckerman,[17] who describe the activities of some individuals who did predict it. The large volume of such material that has appeared suggests many avenues of understanding that might be explored. You could develop models in which some trading agents have incentives aligned with those of the investors who finance them and others do not. You might describe how prices are the product of a clash between competing narratives about the world. You might appreciate the natural human reactions that made it difficult to hold short positions when they returned losses quarter after quarter.
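One such avenue, sketched crudely and with invented parameters: a market in which ‘fundamentalist’ traders pull the price towards an assumed fundamental value while ‘momentum’ traders chase recent moves. Even this toy produces overshooting and slow correction, deviations of just the kind the efficient market hypothesis sets aside.

```python
# Two invented trader types: fundamentalists buy when the price is
# below an assumed fundamental value; momentum traders buy whatever
# has been rising.  All parameters are illustrative assumptions.
fundamental = 100.0
price, prev_price = 105.0, 100.0

for t in range(12):
    fundamentalist_demand = 0.2 * (fundamental - price)  # pull towards fundamentals
    momentum_demand = 0.5 * (price - prev_price)         # chase the recent trend
    prev_price, price = price, price + fundamentalist_demand + momentum_demand
    print(f"t={t:2d}: price = {price:.2f}")
```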
This pragmatic thinking, employing many tools, is a better means of understanding economic phenomena than ‘the combined assumptions of maximising behaviour, market equilibrium, and stable preferences, used relentlessly and consistently’ – and to the exclusion of any other ‘ad hoc’ approach. More eclectic analysis would require not just deductive logic but also an understanding of processes of belief formation, anthropology, psychology and organisational behaviour, and meticulous observation of what people, businesses, and governments actually do. You could learn nothing about how these things influence prices if you started with the proposition that deviations from a specific theory of price determination are ‘too small to matter’ because all that is knowable is already known and therefore ‘in the price’. And that is why today’s students do, in fact, learn nothing about these things, except perhaps from extra-curricular reading.
What Lucas means when he asserts that deviations are ‘too small to matter’ is that attempts to construct general models of deviations from the efficient market hypothesis – by specifying mechanical trading rules or by writing equations to identify bubbles in asset prices – have not met with much success. But this is to miss the point: the expert billiard player plays a nearly perfect game,[18] but it is the imperfections of play between experts that determine the result. There is a – trivial – sense in which the deviations from efficient markets are too small to matter – and a more important sense in which these deviations are the principal thing that matters.
The claim that most profit opportunities in business or in securities markets have been taken is justified. But it is the search for the profit opportunities that have not been taken that drives business forward, and the belief that profit opportunities that have not been arbitraged away still exist that explains why there is so much trade in securities. Far from being ‘too small to matter’, these deviations from efficient market assumptions, not necessarily large, are the dynamic of the capitalist economy.
Such anomalies are idiosyncratic and cannot, by their very nature, be derived as logical deductions from an axiomatic system. The distinguishing characteristic of Henry Ford or Steve Jobs, Warren Buffett or George Soros, is that their behaviour cannot be predicted from any prespecified model. If the behaviour of these individuals could be predicted in this way, they would not have been either innovative or rich. But the consequences are plainly not ‘too small to matter’.
The preposterous claim that deviations from market efficiency were not only irrelevant to the recent crisis but could never be relevant is the product of an environment in which deduction has driven out induction and ideology has taken over from observation. The belief that models are not just useful tools but also are capable of yielding comprehensive and universal descriptions of the world has blinded its proponents to realities that have been staring them in the face. That blindness was an element in our present crisis, and conditions our still ineffectual responses. Economists – in government agencies as well as universities – were obsessively playing Grand Theft Auto while the world around them was falling apart.
[1] Lucas, R. – http://www.economist.com/node/14165405
[4] Cartwright, N. – http://www.amazon.com/Hunting-Causes-Using-Them-Approaches/dp/052167798X
[6] Cochrane, J. – http://faculty.chicagobooth.edu/john.cochrane/research/Papers/krugman_response.htm
[7] Stiglitz, J. – http://econpapers.repec.org/RePEc:tpr:qjecon:v:90:y:1976:i:4:p:630-49
http://www.jstor.org/pss/2296899
http://econpapers.repec.org/RePEc:aea:aecrev:v:71:y:1981:i:3:p:393-410
[8] Frydman, R and M. Goldberg – http://www.amazon.com/Imperfect-Knowledge-Economics-Exchange-Rates/dp/0691121605
[9] Levitt, S. and J.S. Dubner – http://www.amazon.com/Freakonomics-Revised-Expanded-Economist-Everything/dp/0061234001
[10] Gladwell, M. – http://www.amazon.com/Tipping-Point-Little-Things-Difference/dp/0316346624
[11] Taleb, N. N. – http://www.amazon.com/Black-Swan-Improbable-Robustness-Fragility/dp/081297381X
[12] Athreya, K. – http://www.scribd.com/doc/33655771/Economics-is-Hard
[13] Becker, G. – http://www.amazon.com/Economic-Approach-Human-Behavior/dp/0226041123
[14] Davidson, D. – http://www.amazon.com/Rorty-His-Critics-Philosophers-their/dp/0631209824
[15] Korzybski, A. – “A Non-Aristotelian System and its Necessity for Rigour in Mathematics and Physics”
[17] Zuckerman, G. – http://www.amazon.com/Greatest-Trade-Ever-Behind-the-Scenes/dp/0385529945
[18] The example famously used by Friedman and Savage, 1948 – http://www.jstor.org/pss/1826045