Professor of Economics • Curriculum vitae
Division of the Humanities and Social Sciences
California Institute of Technology
1200 East California Boulevard
Pasadena, California 91125
Microeconomic theory. Economics of risk and uncertainty. Theories of information.
Strategic forecasting. Bayesian and interactive epistemology.
The expectation is an example of a descriptive statistic that is monotone with respect to stochastic dominance, and additive for sums of independent random variables. We provide a complete characterization of such statistics, and explore a number of applications to models of individual and group decision-making. These include a representation of stationary, monotone time preferences, extending the work of Fishburn and Rubinstein (1982) to time lotteries, as well as a characterization of risk-averse preferences over monetary gambles that are invariant to mean-zero background risks.
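The two properties named at the start of the abstract can be illustrated with a small numerical sketch (illustrative only, not from the paper): for finite distributions, the expectation is monotone with respect to first-order stochastic dominance and additive for sums of independent random variables.

```python
# Illustrative sketch of the two properties of the expectation described
# in the abstract, for finite distributions given as {outcome: probability}.

def expectation(dist):
    """Expected value of a finite distribution {outcome: probability}."""
    return sum(x * p for x, p in dist.items())

# X first-order stochastically dominates Y: it shifts mass toward the
# higher outcome. Monotonicity: E[X] >= E[Y].
Y = {0: 0.5, 10: 0.5}
X = {0: 0.3, 10: 0.7}
assert expectation(X) >= expectation(Y)

# Distribution of the sum of two independent random variables (convolution).
def independent_sum(d1, d2):
    out = {}
    for x, p in d1.items():
        for y, q in d2.items():
            out[x + y] = out.get(x + y, 0.0) + p * q
    return out

# Additivity: E[X + Y] = E[X] + E[Y] when X and Y are independent.
Z = independent_sum(X, Y)
assert abs(expectation(Z) - (expectation(X) + expectation(Y))) < 1e-12
```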
We show that under plausible levels of background risk, no theory of choice under risk (such as expected utility theory, prospect theory, or rank-dependent utility) can simultaneously satisfy the following three economic postulates: (i) decision makers are risk-averse over small gambles, (ii) they respect stochastic dominance, and (iii) they account for background risk.
We develop an axiomatic theory of costly information acquisition. Our axioms capture the idea of constant marginal costs in information production: the cost of generating two independent signals is the sum of their costs, and the cost of generating a signal with probability half equals half the cost of generating it deterministically. Together with monotonicity and continuity conditions, these axioms completely determine the cost of a signal up to a vector of parameters, one for each pair of states of nature. These parameters have a clear economic interpretation and determine the difficulty of distinguishing between different states. The resulting cost function, which we call the log-likelihood ratio cost, is a linear combination of the Kullback-Leibler divergences (i.e., the expected log-likelihood ratios) between the conditional signal distributions. We argue that this cost function is a versatile modeling tool, and that in various examples of information acquisition it leads to more realistic predictions than the approach based on Shannon entropy.
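The log-likelihood ratio cost described above can be sketched as follows. This is an illustrative implementation under assumed conventions, not the paper's own code: the names `llr_cost` and `beta` are hypothetical, with one nonnegative coefficient per ordered pair of states weighting the corresponding Kullback-Leibler divergence.

```python
import math
from itertools import permutations

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) for finite distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def llr_cost(conditionals, beta):
    """Sketch of a log-likelihood ratio cost: a linear combination of the
    KL divergences between the conditional signal distributions.
    conditionals[i] is the signal distribution in state i;
    beta[(i, j)] >= 0 reflects the difficulty of distinguishing i from j."""
    states = range(len(conditionals))
    return sum(beta[(i, j)] * kl(conditionals[i], conditionals[j])
               for i, j in permutations(states, 2))

# Two states, binary signal: "high" with prob 0.8 in state 0, 0.3 in state 1.
conditionals = [(0.8, 0.2), (0.3, 0.7)]
beta = {(0, 1): 1.0, (1, 0): 1.0}
cost = llr_cost(conditionals, beta)

# Constant marginal cost: two independent copies of the signal cost twice
# as much, since KL divergence is additive over product distributions.
def product(d1, d2):
    return tuple(p * q for p in d1 for q in d2)

double = [product(c, c) for c in conditionals]
assert abs(llr_cost(double, beta) - 2 * cost) < 1e-12
```

The final assertion illustrates the additivity axiom from the abstract: generating two independent signals costs the sum of their individual costs.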