Published March 2020 | Submitted
Journal Article | Open Access

Payoff information and learning in signaling games

Abstract

We add the assumption that players know their opponents' payoff functions and rationality to a model of non-equilibrium learning in signaling games. Agents are born into player roles and play against random opponents every period. Inexperienced agents are uncertain about the prevailing distribution of opponents' play, but believe that opponents never choose conditionally dominated strategies. Agents engage in active learning and update beliefs based on personal observations. Payoff information can refine or expand learning predictions, since patient young senders' experimentation incentives depend on which receiver responses they deem plausible. We show that with payoff knowledge, the limiting set of long-run learning outcomes is bounded above by rationality-compatible equilibria (RCE), and bounded below by uniform RCE. RCE refine the Intuitive Criterion (Cho and Kreps, 1987) and include all divine equilibria (Banks and Sobel, 1987). Uniform RCE sometimes but not always exists, and implies universally divine equilibrium.
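
The record does not include code, so the sketch below is only an illustrative toy of one ingredient described in the abstract: a long-lived sender in a beer-quiche style signaling game who starts out uncertain about how receivers respond to each signal, experiments, and updates beliefs from personal observations. All names, payoff numbers, and the fixed receiver behavior (TRUE_FIGHT_PROB) are assumptions made up for the example, and the Thompson-sampling rule stands in for the paper's optimal experimentation policy; none of this is taken from the paper itself.

# A minimal sketch (not the authors' model): one sender type repeatedly chooses
# a signal, observes the receiver's response, and updates Beta/Dirichlet-style
# beliefs about the response distribution after each signal.
# All payoff numbers and the receiver's fixed behavior are illustrative.

import random

SIGNALS = ["beer", "quiche"]

# Illustrative payoffs for a "strong" sender type: prefers beer, dislikes fights.
SENDER_PAYOFF = {
    ("beer", "retreat"): 3.0,
    ("beer", "fight"): 1.0,
    ("quiche", "retreat"): 2.0,
    ("quiche", "fight"): 0.0,
}

# Assumed true (and unknown to the sender) probability that the receiver
# fights after each signal; the sender only learns this through play.
TRUE_FIGHT_PROB = {"beer": 0.2, "quiche": 0.6}


def thompson_signal(counts):
    """Sample a fight probability from the Beta posterior for each signal
    and send the signal with the higher sampled expected payoff."""
    best_signal, best_value = None, float("-inf")
    for s in SIGNALS:
        fights, retreats = counts[s]
        p_fight = random.betavariate(fights + 1, retreats + 1)
        value = (1 - p_fight) * SENDER_PAYOFF[(s, "retreat")] + p_fight * SENDER_PAYOFF[(s, "fight")]
        if value > best_value:
            best_signal, best_value = s, value
    return best_signal


def simulate(periods=2000, seed=0):
    random.seed(seed)
    counts = {s: [0, 0] for s in SIGNALS}  # per signal: [fights seen, retreats seen]
    for _ in range(periods):
        s = thompson_signal(counts)
        fought = random.random() < TRUE_FIGHT_PROB[s]
        counts[s][0 if fought else 1] += 1
    return counts


if __name__ == "__main__":
    counts = simulate()
    for s in SIGNALS:
        fights, retreats = counts[s]
        n = fights + retreats
        print(f"{s}: sent {n} times, empirical fight rate {fights / max(n, 1):.2f}")

With these made-up numbers the sender's experimentation concentrates on the signal whose learned response is more favorable, which loosely mirrors the abstract's point that which receiver responses young senders deem plausible shapes their experimentation incentives and hence the long-run learning outcome.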

Additional Information

© 2019 Elsevier Inc. Received 11 July 2019, Available online 30 December 2019. We thank Laura Doval, Glenn Ellison, Lorens Imhof, Yuichiro Kamada, Robert Kleinberg, David K. Levine, Kevin K. Li, Eric Maskin, Dilip Mookherjee, Harry Pei, Matthew Rabin, Bill Sandholm, Lones Smith, Joel Sobel, Philipp Strack, Bruno Strulovici, Tomasz Strzalecki, Jean Tirole, Juuso Toikka, and two anonymous referees for helpful comments and conversations, and National Science Foundation grant SES 1643517 for financial support.

Files

Submitted - 1709.01024.pdf (779.1 kB)
md5:8ab50fad014b2672e045e2a7a057044c

Additional details

Created: August 22, 2023
Modified: October 18, 2023