1 Institute of Applied Statistics and Linz Institute of Technology, Johannes Kepler University, Altenberger Strasse 69, A–4040 Linz, Austria; Institute of Statistics, University of Valparaíso, Chile
2 Department of Applied Mathematics and Statistics, Comenius University, Mlynská Dolina, SK–842 48 Bratislava, Slovakia
3 Department of Statistics and Operation Analysis, Mendel University in Brno, Zemědělská 1, CZ–613 00 Brno, Czech Republic
4 Department of Probability and Mathematical Statistics, Charles University, Sokolovská 83, CZ–186 75 Praha, Czech Republic
In this paper we give a partial answer to one of the central questions of statistics, namely, what optimal statistical decisions are and how they relate to (statistical) information theory. We illustrate why it is necessary to understand the structure of information divergences and their approximations, which may in particular be obtained through deconvolution. Deconvolution of information divergences is illustrated for the exponential family of distributions, leading to tests that are optimal in the Bahadur sense. We provide a new approximation of I-divergences based on the Fourier transform, the saddle point approximation, and uniform convergence of Euler polygons. Uniform approximation of the deconvolved parts of I-divergences is also discussed. Our approach is illustrated on a real-data example.
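The central object of the abstract, the I-divergence (Kullback–Leibler divergence), admits a closed form within the Gaussian subfamily of the exponential family. The following sketch is not taken from the paper; it merely illustrates the quantity being approximated, checking the well-known closed form against direct numerical integration of the defining integral (all function names are ours).

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def kl_normal_closed(mu1, s1, mu2, s2):
    """Closed form of I(N(mu1, s1^2) || N(mu2, s2^2)) in the exponential family."""
    return math.log(s2 / s1) + (s1 ** 2 + (mu1 - mu2) ** 2) / (2 * s2 ** 2) - 0.5

def kl_numeric(mu1, s1, mu2, s2, lo=-20.0, hi=20.0, n=20000):
    """Midpoint-rule approximation of the integral  I = ∫ p(x) log(p(x)/q(x)) dx."""
    h = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * h
        p = normal_pdf(x, mu1, s1)
        q = normal_pdf(x, mu2, s2)
        if p > 0.0:
            total += p * math.log(p / q) * h
    return total
```

For instance, `kl_normal_closed(0, 1, 1, 2)` gives log 2 + 2/8 − 1/2 ≈ 0.4431, and the numerical integral agrees to several decimal places; the paper's contribution concerns approximating such divergences when no closed form is available.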
Mathematica Slovaca, the oldest and best mathematical journal in Slovakia, was founded in 1951 at the Mathematical Institute of the Slovak Academy of Sciences, Bratislava. It covers practically all areas of mathematics. As a respected international mathematical journal, it publishes only highly nontrivial original articles with complete proofs, ensured by a rigorous reviewing process.