Abstract
In this paper, we consider the problem of estimating a regression function recursively by minimizing the Mean Squared Relative Error (MSRE), a criterion suited to settings where outliers are present and the response variable is positive. We construct an alternative estimator of the regression function using a stochastic approximation method. The bias, variance, and Mean Integrated Squared Error (MISE) are computed explicitly, and the asymptotic normality of the proposed estimator is proved. Moreover, we conduct a simulation study to compare the performance of the proposed estimator with that of two classical kernel regression estimators, and then illustrate the method on a real malaria dataset.
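The criterion behind the approach can be made explicit. For a positive response \(Y\), the Mean Squared Relative Error of a predictor \(r(x)\) at \(X = x\) is
\[
\mathrm{MSRE}(r; x) = \mathbb{E}\!\left[\left(\frac{Y - r(x)}{Y}\right)^{2} \,\middle|\, X = x\right],
\]
and, whenever \(\mathbb{E}[Y^{-2} \mid X = x] < \infty\), its minimizer is the ratio
\[
r(x) = \frac{\mathbb{E}[Y^{-1} \mid X = x]}{\mathbb{E}[Y^{-2} \mid X = x]},
\]
which down-weights observations with large responses and is therefore less sensitive to outliers than the usual conditional mean.

The sketch below illustrates, under assumptions, how a Révész-type stochastic approximation scheme can estimate this ratio recursively: two recursive kernel estimators track \(\mathbb{E}[Y^{-1} \mid X = x]f(x)\) and \(\mathbb{E}[Y^{-2} \mid X = x]f(x)\), and their ratio is returned. The function name, the stepsizes \(\gamma_n = 1/n\), and the bandwidths \(h_n = n^{-1/5}\) are illustrative choices only, not the tuned values studied in the paper.

```python
import numpy as np

def recursive_relative_regression(X, Y, x_grid, gamma=None, h=None):
    """Hypothetical sketch of a Revesz-type recursive relative-error
    regression estimator: two recursive kernel estimators of
    E[Y^{-1}|X=x] f(x) and E[Y^{-2}|X=x] f(x) are updated one
    observation at a time, and their ratio targets the MSRE-optimal
    regression function."""
    n = len(X)
    if gamma is None:
        gamma = 1.0 / np.arange(1, n + 1)            # stepsizes gamma_n = 1/n (illustrative)
    if h is None:
        h = np.arange(1, n + 1) ** (-1.0 / 5.0)      # bandwidths h_n = n^{-1/5} (illustrative)
    K = lambda u: np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)  # Gaussian kernel

    num = np.zeros_like(x_grid, dtype=float)   # recursive estimate of E[Y^{-1}|X=x] f(x)
    den = np.zeros_like(x_grid, dtype=float)   # recursive estimate of E[Y^{-2}|X=x] f(x)
    for k in range(n):
        w = K((x_grid - X[k]) / h[k]) / h[k]
        num = (1 - gamma[k]) * num + gamma[k] * w / Y[k]
        den = (1 - gamma[k]) * den + gamma[k] * w / Y[k] ** 2
    return num / den

# Usage on simulated data with a positive, heavy-tailed response
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, 500)
Y = np.exp(np.sin(2 * np.pi * X)) * rng.lognormal(sigma=0.3, size=500)
x_grid = np.linspace(0.05, 0.95, 50)
r_hat = recursive_relative_regression(X, Y, x_grid)
```

Each new observation \((X_n, Y_n)\) only updates the two running estimates, so the fit can be refreshed online without revisiting past data; this is the practical appeal of the recursive construction.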
© 2020 Yousri Slaoui et al., published by De Gruyter
This work is licensed under the Creative Commons Attribution 4.0 International License.