An interesting reading to kick off the semester warns of "the dangers of presuming a precision and degree of knowledge we do not have." The article, by Ricardo Caballero, is called "Macroeconomics after the Crisis: Time to Deal with the Pretense-of-Knowledge Syndrome," and appeared in the Fall 2010 issue of the Journal of Economic Perspectives.
Caballero, who is an MIT professor, writes that "On the methodological front, macroeconomic research has been in 'fine-tuning' mode within the local-maximum of the dynamic stochastic general equilibrium world, when we should be in 'broad-exploration' mode."
Basically, the general approach to macroeconomics these days is to begin with a stochastic neoclassical growth model--a basic model of households and firms that everyone learns at the beginning of grad school--and tack on some special effects, like money, monopolistic competition, or nominal rigidities. The problem is that "by some strange herding process the core of macroeconomics seems to transform things that may have been useful modeling short-cuts into a part of a new and artificial 'reality,' and now suddenly everyone uses the same language, which in the next iteration gets confused with, and eventually replaces, reality. Along the way, this process of make-believe substitution raises our presumption of knowledge about the workings of a complex economy."
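For concreteness, here is a bare-bones statement of that core model, in my own notation (a textbook version, not Caballero's): a planner chooses consumption and capital to solve

\[
\max_{\{c_t,\,k_{t+1}\}}\; E_0 \sum_{t=0}^{\infty} \beta^t u(c_t)
\quad \text{subject to} \quad c_t + k_{t+1} = z_t f(k_t) + (1-\delta) k_t,
\]

where $z_t$ is a stochastic productivity shock, $\beta$ is the discount factor, and $\delta$ is the depreciation rate. The "special effects" then enter as extra equations and parameters layered on top: a money-demand block here, monopolistic competitors and sticky prices there.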
We try to make incremental improvements to the model, adding a parameter here and there, but by compounding absurd assumptions we actually push it further and further from reality. "In realistic, real-time settings, both economic agents and researchers have a very limited understanding of the mechanisms at work. This is an order-of-magnitude less knowledge than our core macroeconomic models currently assume, and hence it is highly likely that the optimal approximation paradigm is quite different from current workhorses, both for academic and policy work. In trying to add a degree of complexity to the current core models, by bringing in aspects of the periphery, we are simultaneously making the rationality assumptions behind that core approach less plausible."
Caballero makes a distinction between risk and Knightian uncertainty. The latter involves risk that cannot be measured and hence cannot be hedged--things you'd never think of thinking of. This is the kind of uncertainty that is involved in most crises and panics. In one sense, it is very discouraging that novelty and surprise are such a big part of financial crises, for then, how can we ever hope to model or understand them? On the other hand, there are nevertheless some insights: widespread confusion triggers panics, which trigger demand for broad insurance (and a role for government). Macroeconomists need to embrace complexity and recognize that human reaction to the truly unknown is fundamentally different from reaction to the risks associated with a known situation and environment.
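One common way theorists draw this line formally (a standard device from decision theory, not something in Caballero's article) is max-min expected utility in the style of Gilboa and Schmeidler: under measurable risk the agent knows the probability distribution $p$ and maximizes $E_p[u(a)]$; under Knightian uncertainty she only knows that the true distribution lies in some set $\mathcal{P}$, and evaluates each action by its worst case,

\[
\max_{a}\; \min_{q \in \mathcal{P}} E_q[u(a)].
\]

Worst-case reasoning of this kind is one way to rationalize the flight to safety and the demand for broad insurance that panics produce.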
Last semester, I took a course in Psychology and Economics with Matthew Rabin. The types of models we studied there captured things like belief-based or reference-dependent preferences, ego utility, social preferences, and framing and bracketing effects. All of these tweaks to the standard models are made for the sake of increased realism, but still, they are just tweaks. Usually they involve adding a nice Greek letter to the model. The standard model is a special case for which the Greek parameter is 0 or 1. Agents are allowed to have particular types of risk preferences, but the risks can always be measured, and in fact the models almost always impose rational expectations. Rational inattention is allowed, but Knightian uncertainty is not; agents are perfect Bayesian updaters with well-defined priors. In this sense, as Professor Rabin describes in the syllabus, the course is "purposely, pointedly, persistently, proudly and ponderously mainstream."
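A classic instance of the one-Greek-letter pattern (my example, though it is a staple of this literature) is the quasi-hyperbolic, or $\beta$-$\delta$, model of present bias:

\[
U_t = u(c_t) + \beta \sum_{k=1}^{\infty} \delta^k u(c_{t+k}), \qquad 0 < \beta \le 1,
\]

where setting $\beta = 1$ recovers standard exponential discounting--exactly a special case in which the Greek parameter is 1.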
Caballero says we need to go further from the mainstream and into areas like complex-systems theory and robust-control theory. In doing so we may need to "relax the artificial micro-foundation constraints imposed just for the sake of being able to generate 'structural' general equilibrium simulations."
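To give a flavor of the robust-control direction: in the Hansen-Sargent tradition, a decision maker who distrusts her own model guards against distortions of it, with the size of the distortion priced by a penalty parameter $\theta$ (this is a sketch of their multiplier formulation, in my notation):

\[
V(x) = \max_{a} \Big\{ u(x,a) + \beta \min_{m \ge 0,\; E[m]=1} \big( E[\,m\,V(x')\,] + \theta\, E[\,m \log m\,] \big) \Big\},
\]

where $m$ is a likelihood-ratio twist on the approximating model and $E[m \log m]$ is its relative-entropy cost. As $\theta \to \infty$, distorting becomes prohibitively expensive and the standard rational-expectations recursion reappears; a finite $\theta$ encodes exactly the fear of model misspecification that Caballero wants macroeconomics to take seriously.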
This semester, I have one course that teaches precisely the methodology that Caballero so harshly criticizes. It's a theory-intensive course in macroeconomics, and the main textbook is Recursive Methods in Economic Dynamics by Stokey and Lucas. I'm excited to balance it with a new course called Empirical Methods in Macroeconomics and Finance, taught by Professor Atif Mian, which will focus on new ways to integrate finance and macro. We will also get to learn about some new data sets and work at developing research ideas. Yet another complementary course will be Professor Barry Eichengreen's European Economic History course. Theory, empirics, and history sound like a balanced mental diet for this semester.