Luciano Pomatto


Assistant Professor • Curriculum vitae
Division of the Humanities and Social Sciences
California Institute of Technology

1200 East California Boulevard,
Pasadena, California 91125
E-mail: luciano@caltech.edu

Teaching:

CS/SS/Ec149: Introduction to Algorithmic Economics
Ec117: Matching Markets
SS201c: Analytical Foundations of Social Sciences
SS211c: Advanced Economic Theory
Two lectures on strategic forecasting: [1][2]

Research interests:
Microeconomic theory. Economics of risk and uncertainty. Theories of information.
Strategic forecasting. Bayesian and interactive epistemology.



Working Papers:
  • Model and Predictive Uncertainty: A Foundation for Smooth Ambiguity Preferences     November 2019
    with Tommaso Denti.

    Smooth ambiguity preferences (Klibanoff, Marinacci, and Mukerji, 2005) describe a decision maker who evaluates each act according to a twofold expectation defined by a utility function, an ambiguity index, and a belief over a set of probabilities. We revisit the logic behind this well-known representation. We interpret the set of probabilities as a subjective statistical model, and posit that the decision maker regards it as point identified. Our main result is an axiomatic foundation for this representation within the standard Anscombe-Aumann framework. The result is based on a joint weakening of the Savage and the Anscombe-Aumann axioms. Finally, we extend the analysis to statistical models that are partially identified, in order to capture ambiguity about unknowables.
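
    For reference, the twofold expectation in this representation takes the form (standard background on Klibanoff, Marinacci, and Mukerji, 2005, written in LaTeX notation; the symbols are illustrative, not the paper's own)

        V(f) = \int \phi\Big( \int u(f)\, d\pi \Big)\, d\mu(\pi),

    where u is the utility function, \phi the ambiguity index, and \mu the belief over the set of probabilities \pi.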

  • Blackwell Dominance in Large Samples   July 2019
    with Xiaosheng Mu, Philipp Strack, and Omer Tamuz.

    We study repeated independent Blackwell experiments. Standard examples include drawing multiple samples from a population or performing a measurement in different locations. In the baseline setting of a binary state of nature, we compare experiments in terms of their informativeness in large samples. Addressing a question due to Blackwell (1951), we show that, generically, an experiment is more informative than another in large samples if and only if it has higher Rényi divergences.
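
    As background on the ordering used in this result (a standard definition, not notation from the paper), the Rényi divergence of order \alpha between distributions P and Q on a finite signal space is

        D_\alpha(P \,\|\, Q) = \frac{1}{\alpha - 1} \log \sum_x P(x)^\alpha \, Q(x)^{1-\alpha}, \qquad \alpha \neq 0, 1.

    With a binary state of nature, an experiment induces one signal distribution per state, and the comparison is in terms of the Rényi divergences between these two conditional distributions.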

  • The Cost of Information   February 2019
    with Philipp Strack and Omer Tamuz.

    We develop an axiomatic theory of costly information acquisition. Our axioms capture the idea of constant marginal costs in information production: the cost of generating two independent signals is the sum of their costs, and the cost of generating a signal with probability half equals half the cost of generating it deterministically. Together with monotonicity and continuity conditions, these axioms completely determine the cost of a signal up to a vector of parameters, one for each pair of states of nature. These parameters have a clear economic interpretation and determine the difficulty of distinguishing between different states. The resulting cost function, which we call the log-likelihood ratio cost, is a linear combination of the Kullback-Leibler divergences (i.e., the expected log-likelihood ratios) between the conditional signal distributions. We argue that this cost function is a versatile modeling tool, and that in various examples of information acquisition it leads to more realistic predictions than the approach based on Shannon entropy.
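
    As an illustration of this functional form (the symbols below are chosen for exposition and are not the paper's notation), a log-likelihood ratio cost assigns to an experiment with conditional signal distributions \mu_i, one for each state i, a cost of the form

        C = \sum_{i \neq j} \beta_{ij} \, D_{\mathrm{KL}}(\mu_i \,\|\, \mu_j),

    where each coefficient \beta_{ij} \geq 0 reflects the difficulty of distinguishing state i from state j.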

  • Testable Forecasts   September 2018

    Predictions about the future are often evaluated through statistical tests. As shown by recent literature, many known tests are subject to adverse selection problems and are ineffective at discriminating between forecasters who are competent and forecasters who are uninformed but predict strategically. This paper presents necessary and sufficient conditions under which it is possible to discriminate between informed and uninformed forecasters. It is shown that optimal tests take the form of likelihood-ratio tests comparing forecasters’ predictions against the predictions of a hypothetical Bayesian outside observer. The paper also illustrates a novel connection between the problem of testing strategic forecasters and the classical Neyman-Pearson paradigm of hypothesis testing.

  • Stable Matching under Forward-Induction Reasoning   June 2019

    A standing question in the theory of matching markets is how to define stability under incomplete information. The crucial obstacle is that a notion of stability must include a theory of how beliefs are updated in a blocking pair. This paper proposes a novel epistemic approach. Agents negotiate through offers. Offers are interpreted according to the highest possible degree of rationality that can be ascribed to their proponents, in line with the principle of forward-induction reasoning. This approach leads to a new definition of stability. The main result shows an equivalence between this notion and “incomplete-information stability,” a cooperative solution concept recently put forward by Liu, Mailath, Postlewaite and Samuelson (2014). The result implies that forward-induction reasoning leads to efficient matchings under standard supermodularity conditions.

  • Aggregate Risk and the Pareto Principle   2017
    with Nabil Al-Najjar.

    A crucial distinction in the evaluation of public policies is between plans that involve purely idiosyncratic risk and plans that generate aggregate, correlated risk. While elementary, such a dichotomy is not captured by standard utilitarian aggregators. In this paper we revisit Harsanyi's (1955) celebrated theory of preference aggregation and develop a parsimonious generalization of utilitarianism. The theory we propose can capture sensitivity to aggregate risk, is apt for studying large populations, and is characterized by two simple axioms of preference aggregation.
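
    For context (standard background rather than material from the paper), Harsanyi's aggregation theorem represents the social evaluation of a risky prospect p as a weighted sum of individual expected utilities,

        W(p) = \sum_i \lambda_i \, \mathbb{E}_p[u_i],

    a criterion that is insensitive to whether the underlying risk is idiosyncratic or aggregate; this is the feature the generalization above is designed to address.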



Publications and Forthcoming Papers: