Published July 2016 | Submitted version
Book Section - Chapter (Open Access)

The Possibilities and Limitations of Private Prediction Markets

Abstract

We consider the design of private prediction markets, financial markets designed to elicit predictions about uncertain events without revealing too much information about market participants' actions or beliefs. Our goal is to design market mechanisms in which participants' trades or wagers influence the market's behavior in a way that leads to accurate predictions, yet no single participant has too much influence over what others are able to observe. We study the possibilities and limitations of such mechanisms using tools from differential privacy. We begin by designing a private one-shot wagering mechanism in which bettors specify a belief about the likelihood of a future event and a corresponding monetary wager. Wagers are redistributed among bettors in a way that more highly rewards those with accurate predictions. We provide a class of wagering mechanisms that are guaranteed to satisfy truthfulness, budget balance on expectation, and other desirable properties while additionally guaranteeing epsilon-joint differential privacy in the bettors' reported beliefs, and analyze the trade-off between the achievable level of privacy and the sensitivity of a bettor's payment to her own report. We then ask whether it is possible to obtain privacy in dynamic prediction markets, focusing our attention on the popular cost-function framework in which securities with payments linked to future events are bought and sold by an automated market maker. We show that under general conditions, it is impossible for such a market maker to simultaneously achieve bounded worst-case loss and epsilon-differential privacy without allowing the privacy guarantee to degrade extremely quickly as the number of trades grows, making such markets impractical in settings in which privacy is valued. We conclude by suggesting several avenues for potentially circumventing this lower bound.
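The cost-function framework the abstract refers to is commonly instantiated with Hanson's logarithmic market scoring rule (LMSR), in which an automated market maker prices securities via a convex cost function and incurs a bounded worst-case loss. The following sketch is illustrative only and is not taken from the paper; the function names and the choice of liquidity parameter `b` are assumptions.

```python
import math

def lmsr_cost(q, b=100.0):
    """LMSR cost function C(q) = b * log(sum_i exp(q_i / b)).

    q is the vector of outstanding shares per outcome; b > 0 is the
    liquidity parameter. Subtracting the max keeps exp() stable.
    """
    m = max(q)
    return m + b * math.log(sum(math.exp((qi - m) / b) for qi in q))

def trade_cost(q, delta, b=100.0):
    """Amount a trader pays to move outstanding shares from q to q + delta."""
    q_new = [qi + di for qi, di in zip(q, delta)]
    return lmsr_cost(q_new, b) - lmsr_cost(q, b)

def price(q, i, b=100.0):
    """Instantaneous price of security i: the softmax of q / b."""
    m = max(q)
    exps = [math.exp((qi - m) / b) for qi in q]
    return exps[i] / sum(exps)
```

The market maker's worst-case loss under LMSR is bounded by b * log(n) for n outcomes; the paper's lower bound shows that preserving such a bound while also guaranteeing epsilon-differential privacy forces the privacy guarantee to degrade rapidly with the number of trades.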

Additional Information

© 2016 ACM. A full version of this paper, which includes the appendix, is available online at http://arxiv.org/abs/1602.07362. The first author was supported in part by a Simons Award for Graduate Students in Theoretical Computer Science, NSF grant CNS-1254169, and US-Israel Binational Science Foundation grant 2012348. Much of this research was done while R. Cummings was at Microsoft Research.

Files

Submitted - 1602.07362v1.pdf (534.4 kB, md5:533d6aff99d007cbc56bc32fed017993)

Additional details

Created: August 20, 2023
Modified: October 23, 2023