Fisher Information Properties
Abstract
1. Introduction
1.1. Fisher Information and Other Fields
1.2. Contribution of This Work
2. Notation
3. Fisher Information
4. Several Random Variables Depending on θ_k
4.1. Joint Fisher Information Definition
4.2. An Equivalent Joint Fisher Information Definition
4.3. Conditional Fisher Information Definition
4.4. Chain Rule for Two Random Variables
4.5. Chain Rule for Many Random Variables
5. Relative Fisher Information Type I
6. Information Correlation
The information correlation vanishes when either of the following holds (a numerical sketch follows this list):
- X and Y are independent, i.e., f_{XY;θ} = f_{X;θ} f_{Y;θ}.
- Either f_{X;θ} or f_{Y;θ} does not depend on θ_k.
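As a minimal numerical sketch of the first condition (assuming the information correlation is built from the expected product of the two score functions ∂/∂θ_k log f_{X;θ} and ∂/∂θ_k log f_{Y;θ}; the unit-variance Gaussian toy model and all names below are illustrative, not from the paper):

```python
import numpy as np

# Toy model (assumption): X, Y ~ N(theta, 1), drawn independently.
# For a unit-variance Gaussian, the score is d/dtheta log f = (x - theta).
rng = np.random.default_rng(0)
theta = 2.0
n = 1_000_000

x = rng.normal(theta, 1.0, n)
y = rng.normal(theta, 1.0, n)

score_x = x - theta  # score of X evaluated at the true theta
score_y = y - theta  # score of Y evaluated at the true theta

# Cross term (the information correlation) is ~ 0 under independence;
# each diagonal term is ~ 1, the Fisher information of a single sample.
print(np.mean(score_x * score_y))  # ≈ 0
print(np.mean(score_x ** 2))       # ≈ 1
```

Under independence the joint Fisher information reduces to the sum of the marginal terms, which is what the vanishing cross term reflects.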
7. Mutual Fisher Information Type I
7.1. Definition
7.2. Conditional Mutual Fisher Information of Type I
8. Relative Fisher Information Type II
9. Mutual Fisher Information Type II
10. Other Properties
10.1. Lower Bound for Fisher Information
10.2. In Some Cases, Conditioning Increases the Fisher Information
10.3. Data Processing Inequality
10.4. Upper Bound on Estimation Error
11. Discussion
A. Boundary Condition
Acknowledgments
Conflicts of Interest
References
© 2015 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/4.0/).