8 results sorted by ID
On the Efficient Estimation of Min-Entropy
Yongjune Kim, Cyril Guyot, Young-Sik Kim
Foundations
The min-entropy is an important metric for quantifying the randomness of generated random numbers in cryptographic applications; it measures the difficulty of guessing the most likely output. One of the important min-entropy estimators is the compression estimator of NIST Special Publication (SP) 800-90B, which relies on Maurer's universal test. In this paper, we propose two kinds of min-entropy estimators that improve computational complexity and estimation accuracy by leveraging two variations of...
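For a concrete reference point, min-entropy is defined as $H_\infty(X) = -\log_2 \max_x \Pr[X=x]$. The sketch below is a naive plug-in estimate from empirical frequencies, included only to make the quantity concrete; it is not the SP 800-90B compression estimator, which is built on Maurer's universal statistic.

    # Naive plug-in min-entropy estimate (illustration only, not the
    # SP 800-90B compression estimator).
    import math
    from collections import Counter

    def min_entropy_plugin(samples):
        """Plug-in min-entropy estimate (in bits) from observed symbols."""
        p_max = max(Counter(samples).values()) / len(samples)
        return -math.log2(p_max)

    # Example: a biased bit source with empirical Pr[0] = 0.75.
    samples = [0] * 75 + [1] * 25
    print(min_entropy_plugin(samples))  # -log2(0.75) ~ 0.415 bits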
Complexity of Estimating Renyi Entropy of Markov Chains
Maciej Obremski, Maciej Skorski
Foundations
Estimating the entropy of random processes is one of the fundamental problems of machine learning and property testing. It has numerous applications, ranging from DNA testing and the predictability of human behaviour to modeling neural activity and cryptography. We investigate the problem of Renyi entropy estimation for sources that form Markov chains. Kamath and Verdú (ISIT’16) showed that good mixing properties are essential for that task. We show that even with very good mixing time,...
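The quantity being estimated is Renyi entropy of order $\alpha$, $H_\alpha(p) = \frac{1}{1-\alpha}\log_2 \sum_i p_i^\alpha$ for $\alpha \neq 1$. As a minimal sketch (my own illustration, not the paper's estimator), the code below evaluates this formula on the stationary distribution of a small Markov chain; the paper itself concerns estimation from an observed sample path.

    import numpy as np

    def renyi_entropy(p, alpha):
        """Renyi entropy of order alpha (in bits) of a probability vector p."""
        assert alpha > 0 and alpha != 1
        return np.log2(np.sum(np.asarray(p) ** alpha)) / (1 - alpha)

    # Two-state Markov chain; rows of P are the transition probabilities.
    P = np.array([[0.9, 0.1],
                  [0.2, 0.8]])
    # Stationary distribution: the left eigenvector of P for eigenvalue 1.
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmax(np.real(w))])
    pi /= pi.sum()
    print(renyi_entropy(pi, alpha=2))  # collision entropy of the stationary law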
Improved Security Evaluation Techniques for Imperfect Randomness from Arbitrary Distributions
Takahiro Matsuda, Kenta Takahashi, Takao Murakami, Goichiro Hanaoka
Dodis and Yu (TCC 2013) studied how the security of cryptographic primitives that are secure in the "ideal" model, in which the randomness is drawn from the uniform distribution, degrades when the ideal distribution is switched to a "real-world" (possibly biased) distribution that has some lower bound on its min-entropy or collision entropy. However, in many constructions, security is guaranteed only when the randomness is sampled from some non-uniform...
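For reference, the two entropy guarantees mentioned here are min-entropy $H_\infty(p) = -\log_2 \max_i p_i$ and collision entropy $H_2(p) = -\log_2 \sum_i p_i^2$, which always satisfy $H_\infty \le H_2 \le 2H_\infty$. A quick numerical check of this sandwich (my own illustration, unrelated to the paper's constructions):

    import math, random

    def h_inf(p): return -math.log2(max(p))
    def h_2(p):   return -math.log2(sum(q * q for q in p))

    for _ in range(1000):
        raw = [random.random() for _ in range(8)]
        p = [x / sum(raw) for x in raw]          # a random biased distribution
        assert h_inf(p) <= h_2(p) <= 2 * h_inf(p) + 1e-9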
On Renyi Entropies and their Applications to Guessing Attacks in Cryptography
Serdar Boztas
Foundations
We consider single and multiple attacker scenarios in guessing and obtain bounds on various success parameters in terms of Renyi entropies. We also obtain a new derivation of the union bound.
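In the basic single-attacker setting, the optimal strategy queries values in order of decreasing probability, so the expected number of guesses is $E[G] = \sum_i i \, p_{(i)}$ with the $p_{(i)}$ sorted in decreasing order. Renyi entropy of order $1/2$ is the natural yardstick here; the sketch below numerically checks Arikan-style bounds of the form $2^{H_{1/2}}/(1+\ln M) \le E[G] \le 2^{H_{1/2}}$ (my own illustration; the paper's scenarios and bounds are more general).

    import math, random

    def expected_guesses(p):
        """E[G] for the optimal guesser: try symbols in decreasing probability."""
        return sum((i + 1) * q for i, q in enumerate(sorted(p, reverse=True)))

    def h_half(p):
        """Renyi entropy of order 1/2, in bits."""
        return 2 * math.log2(sum(math.sqrt(q) for q in p))

    M = 16
    raw = [random.random() for _ in range(M)]
    p = [x / sum(raw) for x in raw]
    g, bound = expected_guesses(p), 2 ** h_half(p)
    assert bound / (1 + math.log(M)) <= g <= bound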
Renyi Entropy Estimation Revisited
Maciej Obremski, Maciej Skorski
We revisit the problem of estimating entropy of discrete distributions from independent samples, studied recently by Acharya, Orlitsky, Suresh and Tyagi (SODA 2015), improving their upper and lower bounds on the necessary sample size $n$. For estimating Renyi entropy of order $\alpha$, up to constant accuracy and error probability, we show the following: (a) upper bounds $n = O(1) \cdot 2^{\left(1-\frac{1}{\alpha}\right)H_{\alpha}}$ for integer $\alpha>1$, as the worst case over...
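For integer $\alpha$, estimators in this line of work are built on the power sum $\sum_i p_i^\alpha$, for which the falling-factorial statistic gives an unbiased estimate. A minimal sketch under those assumptions (i.i.d. samples, integer $\alpha > 1$; the variance analysis and refinements from the literature are omitted):

    import math
    from collections import Counter

    def renyi_estimate(samples, alpha):
        """Estimate H_alpha (bits) via the unbiased power-sum statistic."""
        n = len(samples)
        # Falling factorial c*(c-1)*...*(c-alpha+1); it is 0 whenever c < alpha,
        # so the estimate needs at least one symbol repeated alpha times.
        num = sum(math.prod(range(c - alpha + 1, c + 1))
                  for c in Counter(samples).values())
        den = math.prod(range(n - alpha + 1, n + 1))
        return math.log2(num / den) / (1 - alpha)

    print(renyi_estimate([0, 1, 1, 2, 1, 0, 1, 1], alpha=2))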
Improved Estimation of Collision Entropy in High and Low-Entropy Regimes and Applications to Anomaly Detection
Maciej Skorski
Foundations
We revisit the problem of estimating Renyi entropy from samples, focusing on the important case of collision entropy. With $n$ samples we approximate the collision entropy of $X$ within an additive factor of $O\left( 2^{2\Delta}\log^{\frac{1}{2}}(1/\epsilon) \right)$ with probability $1-\epsilon$, where $\Delta$ is a known (a priori) upper bound on the difference between the Renyi entropies of $X$ of orders 2 and 3. In particular, we simplify and improve the previous result on...
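Collision entropy is $H_2(X) = -\log_2 \Pr[X = X']$ for independent copies $X, X'$, and the collision probability has a simple unbiased estimator: the fraction of colliding pairs among the $n$ samples. A minimal sketch (my own illustration, not the paper's sharpened estimator):

    import math
    from collections import Counter

    def collision_entropy_estimate(samples):
        """Estimate H_2 (bits) from the fraction of colliding sample pairs."""
        n = len(samples)
        colliding = sum(c * (c - 1) for c in Counter(samples).values())
        p_col = colliding / (n * (n - 1))   # unbiased for sum_i p_i^2
        return -math.log2(p_col)            # requires at least one collision

    print(collision_entropy_estimate([3, 1, 4, 1, 5, 9, 2, 6, 5, 3]))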
A Comprehensive Comparison of Shannon Entropy and Smooth Renyi Entropy
Maciej Skorski
Foundations
We provide a new result that links two crucial entropy notions: Shannon entropy $\mathrm{H}_1$ and collision entropy $\mathrm{H}_2$. Our formula gives the \emph{worst possible} amount of collision entropy in a probability distribution when its Shannon entropy is fixed. Our results, and the techniques used in the proof, immediately imply many quantitatively tight separations between Shannon and smooth Renyi entropy, which were previously known only as qualitative statements or one-sided bounds. In...
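Distributions with one heavy atom plus a near-uniform tail are the usual extremal shape in separations of this kind; the sketch below (my own illustration, not the paper's exact worst-case formula) shows how large the $\mathrm{H}_1$-versus-$\mathrm{H}_2$ gap can get for that family.

    import math

    def shannon_and_collision(beta, N):
        """H_1 and H_2 (bits) for one atom of mass beta plus a uniform tail."""
        tail = (1 - beta) / (N - 1)
        h1 = -beta * math.log2(beta) - (N - 1) * tail * math.log2(tail)
        h2 = -math.log2(beta ** 2 + (N - 1) * tail ** 2)
        return h1, h2

    print(shannon_and_collision(0.5, 2 ** 20))  # H_1 ~ 11 bits, H_2 ~ 2 bits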
The Spammed Code Offset Method
Boris Skoric, Niels de Vreede
Helper data schemes are security primitives used for privacy-preserving biometric databases and Physical Unclonable Functions. One of the oldest known helper data schemes is the Code Offset Method (COM). We propose an extension of the COM: the helper data is accompanied by many instances of fake helper data drawn from the same distribution as the real one. While the adversary has no way to distinguish between them, the legitimate party has more information and {\em can} see the...
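For reference, in the basic COM the helper data is $w = x \oplus c$ for a random codeword $c$; a noisy re-reading $x'$ yields $x' \oplus w = c \oplus e$, which is decoded to recover $c$ and hence $x$. A toy sketch over GF(2) with a repetition code (my own illustration; real instantiations use stronger codes, and the spammed variant of this paper additionally stores decoy helper strings drawn from the same distribution):

    import secrets

    R = 5  # repetition code: a codeword is one bit repeated R times

    def enroll(x):
        """x: list of R bits (e.g. a PUF/biometric reading). Returns w = x XOR c."""
        b = secrets.randbelow(2)
        return [xi ^ b for xi in x]            # c = [b] * R, so w = x XOR c

    def reconstruct(x_noisy, w):
        """Recovers the enrolled x as long as fewer than R/2 bits flipped."""
        noisy_cw = [xi ^ wi for xi, wi in zip(x_noisy, w)]
        b = int(2 * sum(noisy_cw) > R)         # majority-decode c XOR e
        return [wi ^ b for wi in w]

    x = [1, 0, 1, 1, 0]
    w = enroll(x)
    x_noisy = x[:]; x_noisy[2] ^= 1            # one bit-error in the re-reading
    assert reconstruct(x_noisy, w) == x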