Paper 2023/943

Correlated-Output Differential Privacy and Applications to Dark Pools

James Hsin-yu Chiang, Aarhus University
Bernardo David, IT University of Copenhagen
Mariana Gama, imec-COSIC, KU Leuven
Christian Janos Lebeda, IT University of Copenhagen
Abstract

In the classical setting of differential privacy, a privacy-preserving query is performed on a private database, after which the query result is released to the analyst; a differentially private query ensures that the presence of a single database entry is hidden from the analyst’s view. In this work, we contribute the first definitional framework for differential privacy in the trusted curator setting: clients submit private inputs to the trusted curator, which then computes individual outputs privately returned to each client. The adversary is more powerful than in the standard setting; it can corrupt up to n − 1 clients and subsequently choose the inputs and learn the outputs of corrupted parties. In this setting, the adversary also obtains leakage from any honest output that is correlated with a corrupted output. Standard differentially private mechanisms protect client inputs but do not mitigate output correlation, which can leak arbitrary client information and thus forfeit client privacy completely. We initiate the investigation of a novel notion of correlated-output differential privacy to bound the leakage from output correlation in the trusted curator setting. We define the satisfaction of both standard and correlated-output differential privacy as round differential privacy and highlight the relevance of this novel privacy notion to all application domains in the trusted curator model. We explore round differential privacy in traditional “dark pool” market venues, which promise privacy-preserving trade execution to mitigate front-running; privately submitted trade orders and trade executions are kept private by the trusted venue operator. We observe that dark pools satisfy neither classic nor correlated-output differential privacy: in markets with low trade activity, the adversary may trivially observe recurring honest trading patterns, and anticipate and front-run future trades.
In response, we present the first round differentially private market mechanisms that formally mitigate information leakage from all trading activity of a user. This is achieved with fuzzy order matching, inspired by the standard randomized response mechanism. Fuzzy matching weakens output correlation but also introduces a liquidity mismatch, as buy and sell orders are no longer guaranteed to execute pairwise; this mismatch is compensated for by a round differentially private liquidity provider mechanism, which freezes a noisy amount of assets from the liquidity provider for the duration of a privacy epoch but leaves trader balances unaffected. We propose oblivious algorithms for realizing our market mechanisms with secure multi-party computation (MPC) and implement these in the Scale-Mamba framework using Shamir-secret-sharing-based MPC. We demonstrate practical, round differentially private trading with throughput comparable to that of prior work implementing (traditional) dark pool algorithms in MPC; our experiments demonstrate practicality for both traditional finance and decentralized finance settings.
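To give intuition for the randomized response mechanism that the abstract cites as the inspiration for fuzzy order matching, here is a minimal, hypothetical Python sketch (not the paper's actual mechanism): a binary outcome, e.g. "order executed" vs. "not executed", is reported truthfully only with probability e^ε / (1 + e^ε), which makes the single report ε-differentially private.

```python
import math
import random

def randomized_response(true_bit: int, epsilon: float, rng: random.Random) -> int:
    """Report true_bit with probability p = e^eps / (1 + e^eps), else its flip.

    This choice of p yields eps-differential privacy for one bit, since
    P[output = b | input = b] / P[output = b | input = 1 - b] = e^eps.
    """
    p = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return true_bit if rng.random() < p else 1 - true_bit

# Illustration: noisy reports of a true outcome of 1 give each individual
# report plausible deniability, while the aggregate remains estimable.
rng = random.Random(0)
reports = [randomized_response(1, epsilon=1.0, rng=rng) for _ in range(10_000)]
```

In the paper's setting, analogous per-order randomness means any single trade's execution is hidden, at the cost of the liquidity mismatch that the liquidity provider mechanism described above absorbs.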

Metadata
Available format(s)
PDF
Category
Cryptographic protocols
Publication info
Preprint.
Keywords
Differential Privacy, Secure Multi-party Computation, Dark Pools, Decentralized Finance
Contact author(s)
jachiang@ucla.edu
bernardo@bmdavid.com
mariana.botelhodagama@esat.kuleuven.be
chle@itu.dk
History
2023-08-19: revised
2023-06-16: received
Short URL
https://ia.cr/2023/943
License
Creative Commons Attribution
CC BY

BibTeX

@misc{cryptoeprint:2023/943,
      author = {James Hsin-yu Chiang and Bernardo David and Mariana Gama and Christian Janos Lebeda},
      title = {Correlated-Output Differential Privacy and Applications to Dark Pools},
      howpublished = {Cryptology {ePrint} Archive, Paper 2023/943},
      year = {2023},
      url = {https://eprint.iacr.org/2023/943}
}