
Statistical Applications in Genetics and Molecular Biology

Volume 14, Issue 3 (Jun 2015)


A mutual information estimator with exponentially decaying bias

Zhiyi Zhang (corresponding author), Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223, USA
Lukun Zheng, Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223, USA
Published Online: 2015-05-05 | DOI: https://doi.org/10.1515/sagmb-2014-0047

Abstract

A nonparametric estimator of mutual information is proposed and shown to be asymptotically normal and efficient, with a bias that decays exponentially in sample size. The asymptotic normality and the rapidly decaying bias together offer a viable inferential tool for assessing mutual information between two random elements on finite alphabets, a setting in which the maximum likelihood estimator of mutual information greatly inflates the probability of a type I error. The proposed estimator is illustrated by three examples in which the association between a pair of genes is assessed based on their expression levels. Several results of a simulation study are also provided.
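
For orientation, the maximum likelihood (plug-in) estimator that the abstract contrasts against simply pushes the empirical joint distribution through the definition I(X;Y) = sum_{x,y} p(x,y) log[ p(x,y) / (p(x) p(y)) ]; on finite alphabets its bias is positive and of order 1/n, which is what inflates the probability of a type I error when testing independence. The Python sketch below illustrates only this baseline plug-in estimator on a small contingency table; it is not the estimator proposed in the article (whose exponentially-decaying-bias construction is given in the full text), and the 4 x 4 alphabet, sample size, and variable names are arbitrary choices made for illustration.

import numpy as np

def plug_in_mutual_information(counts):
    """Plug-in (maximum likelihood) estimate of mutual information, in nats,
    computed from a two-way contingency table of joint counts."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p_xy = counts / n                      # empirical joint distribution
    p_x = p_xy.sum(axis=1, keepdims=True)  # marginal of the row variable
    p_y = p_xy.sum(axis=0, keepdims=True)  # marginal of the column variable
    denom = p_x @ p_y                      # product of the marginals
    mask = p_xy > 0                        # 0 * log 0 = 0 by convention
    return float(np.sum(p_xy[mask] * np.log(p_xy[mask] / denom[mask])))

# Two independent discrete variables on a 4 x 4 alphabet: the true mutual
# information is 0, yet the plug-in estimate is strictly positive on almost
# every finite sample; this is the upward bias the abstract refers to.
rng = np.random.default_rng(0)
x = rng.integers(0, 4, size=50)
y = rng.integers(0, 4, size=50)
table = np.zeros((4, 4))
for xi, yi in zip(x, y):
    table[xi, yi] += 1
print(plug_in_mutual_information(table))  # noticeably above 0 despite independence

In a test of independence this upward bias translates directly into rejecting too often, which is the inflation of the type I error probability that the proposed estimator, with its exponentially decaying bias and asymptotic normality, is designed to avoid.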

Keywords: asymptotic normality; bias; MLE; mutual information; nonparametric estimator

AMS 2000 Subject Classifications: Primary 62F10; 62F12; 62G05; 62G20


About the article

Corresponding author: Zhiyi Zhang, Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223, USA


Published Online: 2015-05-05

Published in Print: 2015-06-01


Citation Information: Statistical Applications in Genetics and Molecular Biology, ISSN (Online) 1544-6115, ISSN (Print) 2194-6302, DOI: https://doi.org/10.1515/sagmb-2014-0047.


©2015 by De Gruyter.

Citing Articles


[1] Thalia E. Chan, Michael P.H. Stumpf, and Ann C. Babtie, Cell Systems, 2017, Volume 5, Number 3, Page 251.
