The Socratic Method, an early example of an adversarial method.

Adversarial Estimators

This paper develops the statistical theory of adversarial estimators, which estimate quantities of interest by pitting two models against each other: it derives convergence rates and asymptotic normality. The paper establishes traditional statistical guarantees even in semiparametric settings involving deep neural networks, and highlights information-theoretic connections among a range of estimators, from traditional econometric methods such as Empirical Likelihood, through Generative AI methods such as GANs, to modern nonparametric causal inference estimators. A minimal illustrative sketch of the adversarial estimation idea follows below.

January 2023 · Jonas Metzger
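
As a rough illustration of the minimax idea (not the paper's implementation), the PyTorch sketch below pits a simple structural model against a neural-network critic: the critic is trained to distinguish observed from simulated data, and the structural parameter is updated to make the two samples indistinguishable. The toy location-scale model, the `simulate` helper, and all hyperparameters are hypothetical choices made for this sketch.

```python
# Hedged sketch of a generic adversarial (minimax) estimator in PyTorch.
# A structural model simulates data given a low-dimensional parameter theta;
# a critic network is trained to tell simulated from observed data;
# theta is then updated to make the two samples indistinguishable.
import torch
import torch.nn as nn

torch.manual_seed(0)

# "Observed" data from a location-scale model with unknown (mu, sigma).
true_mu, true_sigma = 1.5, 0.7
x_obs = true_mu + true_sigma * torch.randn(2000, 1)

# Structural parameter to estimate: (mu, log_sigma), log for positivity.
theta = torch.tensor([0.0, 0.0], requires_grad=True)

def simulate(n):
    """Simulate data from the structural model at the current theta."""
    mu, log_sigma = theta[0], theta[1]
    return mu + torch.exp(log_sigma) * torch.randn(n, 1)

# The critic plays the role of the adversary (the "second model").
critic = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))

opt_theta = torch.optim.Adam([theta], lr=1e-2)
opt_critic = torch.optim.Adam(critic.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()
n = len(x_obs)

for step in range(3000):
    # Inner maximization: critic learns to separate observed (1) from simulated (0).
    x_sim = simulate(n)
    logits = torch.cat([critic(x_obs), critic(x_sim.detach())])
    labels = torch.cat([torch.ones(n, 1), torch.zeros(n, 1)])
    loss_critic = bce(logits, labels)
    opt_critic.zero_grad(); loss_critic.backward(); opt_critic.step()

    # Outer minimization: theta moves so simulated data are classified as observed.
    loss_theta = bce(critic(simulate(n)), torch.ones(n, 1))
    opt_theta.zero_grad(); loss_theta.backward(); opt_theta.step()

print("estimated mu, sigma:", theta[0].item(), theta[1].exp().item())
```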
WGANs can capture economic data well, pretty much out of the box

Using Wasserstein Generative Adversarial Networks for the design of Monte Carlo simulations

Objective benchmarks based on real-world datasets have been crucial for methodological progress in machine learning. Such benchmarks do not exist in causal inference, which aims to predict counterfactuals that are never observed in the data. Here, we show that GANs can be used to learn data-generating processes that closely resemble real data in terms of observables, while containing the unobserved counterfactuals necessary to evaluate causal inference methods. We show how causal inference practitioners can use this approach to evaluate methodological progress and to select appropriate methods for specific datasets. A minimal illustrative sketch of the approach follows below.

March 2021 · Susan Athey, Guido Imbens, Jonas Metzger, Evan Munro
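
The PyTorch sketch below illustrates this idea under toy assumptions (it is not the authors' implementation): conditional WGAN-GP generators for the outcome given covariates are fit separately on control and treated units, and a synthetic benchmark dataset is then drawn in which both potential outcomes, and hence the true treatment effect, are known. The toy data, network sizes, and hyperparameters are all illustrative.

```python
# Hedged sketch: conditional WGAN-GP generators for Y given X, fit per treatment
# arm, used to simulate a benchmark dataset with known counterfactuals.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy observational data: covariate X, confounded treatment T, observed outcome Y.
n = 4000
x = torch.randn(n, 1)
t = (torch.rand(n, 1) < torch.sigmoid(x)).float()
y = 1.0 + 2.0 * x + t * (1.0 + x) + 0.5 * torch.randn(n, 1)  # heterogeneous effect

def gradient_penalty(crit, real, fake):
    """Standard WGAN-GP penalty on interpolated inputs."""
    eps = torch.rand(len(real), 1)
    mid = (eps * real + (1 - eps) * fake).requires_grad_(True)
    grad = torch.autograd.grad(crit(mid).sum(), mid, create_graph=True)[0]
    return ((grad.norm(2, dim=1) - 1) ** 2).mean()

def fit_conditional_wgan(x_sub, y_sub, steps=2000):
    """Fit a WGAN-GP for the conditional distribution of Y given X on one arm."""
    gen = nn.Sequential(nn.Linear(1 + 2, 64), nn.ReLU(), nn.Linear(64, 1))
    crit = nn.Sequential(nn.Linear(1 + 1, 64), nn.ReLU(), nn.Linear(64, 1))
    opt_g = torch.optim.Adam(gen.parameters(), lr=1e-3, betas=(0.5, 0.9))
    opt_c = torch.optim.Adam(crit.parameters(), lr=1e-3, betas=(0.5, 0.9))
    for step in range(steps):
        # Critic update: separate real (X, Y) pairs from generated ones.
        z = torch.randn(len(x_sub), 2)
        y_fake = gen(torch.cat([x_sub, z], dim=1))
        real = torch.cat([x_sub, y_sub], dim=1)
        fake = torch.cat([x_sub, y_fake.detach()], dim=1)
        loss_c = crit(fake).mean() - crit(real).mean() \
            + 10.0 * gradient_penalty(crit, real, fake)
        opt_c.zero_grad(); loss_c.backward(); opt_c.step()
        # Generator update, less frequent than the critic.
        if step % 5 == 0:
            z = torch.randn(len(x_sub), 2)
            y_fake = gen(torch.cat([x_sub, z], dim=1))
            loss_g = -crit(torch.cat([x_sub, y_fake], dim=1)).mean()
            opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return gen

gen0 = fit_conditional_wgan(x[t == 0].unsqueeze(1), y[t == 0].unsqueeze(1))
gen1 = fit_conditional_wgan(x[t == 1].unsqueeze(1), y[t == 1].unsqueeze(1))

# Synthetic benchmark: resample covariates, then draw *both* potential outcomes,
# so the ground-truth effect of the synthetic DGP is known by construction.
x_sim = x[torch.randint(0, n, (n,))]
y0_sim = gen0(torch.cat([x_sim, torch.randn(n, 2)], dim=1))
y1_sim = gen1(torch.cat([x_sim, torch.randn(n, 2)], dim=1))
print("ground-truth ATE of synthetic DGP:", (y1_sim - y0_sim).mean().item())
```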