Abstract
We are interested in a class of seasonal autoregressive moving average (SARMA) models with periodically varying parameters, the so-called seasonal periodic autoregressive moving average (SPARMA) models, under the assumption that the errors are uncorrelated but not independent (i.e. weak SPARMA models). Relaxing the classical independence assumption on the errors considerably extends the range of application of SPARMA models and allows one to cover linear representations of general nonlinear processes. We establish the asymptotic properties of the quasi-generalized least squares (QLS) estimator of these models. Particular attention is given to the estimation of the asymptotic variance matrix of the QLS estimator, which may be very different from that obtained in the standard framework. A set of Monte Carlo experiments is presented.
Acknowledgments
We sincerely thank the anonymous referee and Editor for helpful remarks.
Appendix A: Proofs
The proofs of Theorems 1 and 2 are quite technical. They are adaptations of the arguments used in Francq, Roy, and Saidi (2011). In order to shorten the paper and to highlight the novelties of our proofs, only the new arguments are developed. Let K and ρ be generic constants, whose values may change from one line to the next, such that K > 0 and 0 < ρ < 1.
A.1 Reminder on Technical Issues on WLS Method for SPARMA Models
We recall that, given a realization X_1, …, X_{NT}, the periodic noise ϵ_{nT+ν}(α) is approximated by e_{nT+ν}(α), which is defined in (8).
The starting point of the asymptotic analysis is the property that ϵ_{nT+ν}(α) − e_{nT+ν}(α) converges uniformly to 0 (almost surely) as n goes to infinity. Similar properties also hold for the derivatives with respect to α of ϵ_{nT+ν}(α) − e_{nT+ν}(α). We sum up the facts that we shall need in the sequel, and refer to the appendix of Francq, Roy, and Saidi (2011) for a more detailed treatment.
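The effect of truncating the infinite past can be illustrated numerically. The sketch below is not from the paper: it uses a plain ARMA(1,1) recursion with hypothetical coefficients phi and theta, run twice from different initial values. The gap between the two residual sequences decays geometrically, which is the mechanism behind the almost-sure uniform convergence of ϵ_{nT+ν}(α) − e_{nT+ν}(α) to 0.

```python
# Numerical illustration (not from the paper): phi and theta are hypothetical.
# The residual recursion is run twice with different initial values; the gap
# between the two residual sequences decays geometrically.
import random

random.seed(0)
phi, theta = 0.5, 0.4  # hypothetical parameters; invertibility needs |theta| < 1
n = 200

# Simulate X_t = phi * X_{t-1} + eps_t + theta * eps_{t-1}
eps = [random.gauss(0.0, 1.0) for _ in range(n)]
x = [0.0] * n
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t] + theta * eps[t - 1]

def residuals(init):
    """Invert the MA part: e_t = X_t - phi * X_{t-1} - theta * e_{t-1}."""
    e = [init]
    for t in range(1, n):
        e.append(x[t] - phi * x[t - 1] - theta * e[t - 1])
    return e

e_trunc = residuals(0.0)  # truncated residuals: unknown past replaced by 0
e_other = residuals(5.0)  # same recursion started from a wrong initial value

# The gap is |theta|^t times the initial gap, so it vanishes geometrically.
gap = [abs(a - b) for a, b in zip(e_trunc, e_other)]
```

In the periodic case the same recursion is applied season by season, and the geometric decay is what yields the uniform bounds recalled below.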
Under Assumption (A1), for any α ∈ Δ_δ and any
A useful property of the matrix sequences C is that they are asymptotically exponentially small. Indeed, there exists
where K is a positive constant. Finally, from the above estimates, we are able to deduce that for any
for any constant
This implies that the sequences
We also have
by using the fact that
for any ɛ > 0 and N large enough, and that
A.2 Proof of Theorem 3
The proof of Theorem 3 is based on a series of lemmas. We use the multiplicative matrix norm defined by:
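The display defining this norm is not reproduced here. As an assumption (not quoted from the paper), a standard multiplicative matrix norm used in this literature (e.g. Francq and Zakoïan 2019) is the operator norm:

```latex
\lVert A \rVert = \sup_{\lVert x \rVert \le 1} \lVert A x \rVert
               = \rho^{1/2}\!\left(A^{\top} A\right),
\qquad\text{which satisfies}\qquad
\lVert A B \rVert \le \lVert A \rVert \, \lVert B \rVert
\quad\text{and}\quad
\lvert a_{i,j} \rvert \le \lVert A \rVert,
```

where ρ(·) denotes the spectral radius; the submultiplicativity and the entrywise bound are the two properties used repeatedly in the lemmas below.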
with a_{i,j} the entries of
where
We then obtain
It suffices to show that
In view of the assumptions of Theorem 3, we have
To obtain the convergence in probability of
By (17) we obtain
and
Consequently the least squares estimator of
where
Similar arguments combined with (15) yield
By (34) we obtain
In view of the assumptions of Theorem 3 and (30), we obtain
and we also have
Lemma 1
Under the assumptions of Theorem 3, we have
Proof
See Lemma 1 in the supplementary material of Boubacar Mainassara, Carbon, and Francq (2012). □
Lemma 2
Under the assumptions of Theorem 3, there exists a finite positive constant K such that, for 1 ≤ r_1, r_2 ≤ r and 1 ≤ m_1, m_2 ≤ T(p + q + P + Q), we have
Proof
We can take the supremum over the integers n > 0 and write the proof in the case m_1 = m_2 = m. We have
where
A slight extension of (Francq and Zakoïan 2019, Corollary A.3) concludes. □
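For the reader's convenience, the bound invoked above rests on a covariance inequality of Davydov type; a standard statement (given here as background, not quoted verbatim from the book) is: for random variables X and Y that are strongly mixing with mixing coefficient α(·),

```latex
\bigl|\operatorname{Cov}(X, Y)\bigr|
  \le K \,\lVert X \rVert_{p}\, \lVert Y \rVert_{q}\,
      \bigl[\alpha\{\sigma(X), \sigma(Y)\}\bigr]^{1/r},
\qquad \frac{1}{p} + \frac{1}{q} + \frac{1}{r} = 1,
```

where ‖·‖_p denotes the L^p norm, σ(X) the σ-field generated by X, and K a universal constant.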
Lemma 3
Under the assumptions of Theorem 3,
Proof
For 1 ≤ m_1, m_2 ≤ T(p + q + P + Q) and 1 ≤ r_1, r_2 ≤ r, the (
For all β > 0 we use (30) and we obtain
The stationarity of the process
Finally, we have
when
We show in the following lemma that the previous lemma remains valid when we replace ϒ_n(α_0) by
Lemma 4
Under the assumptions of Theorem 3,
Proof
We denote
By Lemma 3 the term
when
Step 1: proof of (37).
For all β > 0 we have
where
It follows that
where
Now notice that
Substituting the above identity into (39), we obtain by Hölder’s inequality that
where
We have
Then we obtain
The same calculations hold for the terms D_{N,2}, D_{N,3}, and D_{N,4}. Thus
and substituting this estimate into (40) and using (29), we have
Then the sequence
Step 2: proof of (38).
First, we follow the same approach as in the previous step. We have
Since
one has
where
Taylor expansions around α_0 yield that there exists
and
with
and that
The same arguments are valid for E_{N,2}, E_{N,3}, and E_{N,4}. Consequently
when
Lemma 5
Proof
Recalling (15) and (33), we have
By the orthogonality conditions in (15) and (33), one has
and consequently
Lemmas 1 and 2, together with (44), imply that
By the assumptions of Theorem 3,
Lemma 6
Under the assumptions of Theorem 3, we have
as N → ∞ when
Proof
We have
and by induction we obtain
We have
Lemma 7
Under the assumptions of Theorem 3, we have
Proof
By (33), we have
and so we have
The lemma is proved. □
Proof of Theorem 3
Appendix B: Verification of Assumption (A1) on an Example
Example 1
We now examine the causality and stationarity conditions for the following SPARMA(1, 1)(1, 1) model:
This SPARMA(1, 1)(1, 1) model admits the following PARMA(1 + T, 1 + T) representation:
Finally we deduce the following VARMA(2, 2) model:
where, taking T = 4 for simplicity, the matrices A_0, A_1, A_2, B_0, B_1, and B_2 are given by
we have
We have
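As a complement, the periodic stationarity condition can be checked numerically in the simplest case. For a periodic AR(1) with period T, X_{nT+ν} = φ_ν X_{nT+ν−1} + ε_{nT+ν}, periodic stationarity holds if and only if |φ_1 ⋯ φ_T| < 1 (see e.g. Basawa, Lund, and Shao 2004). The sketch below uses hypothetical period-4 coefficients:

```python
# Periodic stationarity check for a PAR(1) model with hypothetical period-4
# coefficients: the condition is |phi_1 * ... * phi_T| < 1, so individual
# seasonal coefficients may exceed 1 in absolute value.
from math import prod

def par1_is_periodically_stationary(phis):
    """phis: seasonal AR(1) coefficients phi_1, ..., phi_T."""
    return abs(prod(phis)) < 1.0

# |1.2 * 0.9 * 0.8 * 0.5| = 0.432 < 1: stationary although phi_1 > 1
print(par1_is_periodically_stationary([1.2, 0.9, 0.8, 0.5]))
# |1.5 * 1.1 * 0.9 * 0.8| = 1.188 >= 1: not periodically stationary
print(par1_is_periodically_stationary([1.5, 1.1, 0.9, 0.8]))
```

The product condition is the one-dimensional analogue of requiring the spectral radius of the companion matrix of the stacked (VARMA) representation above to be strictly less than 1.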
References
Aknouche, A., and A. Bibi. 2009. “Quasi-maximum Likelihood Estimation of Periodic GARCH and Periodic ARMA-GARCH Processes.” Journal of Time Series Analysis 30 (1): 19–46. https://doi.org/10.1111/j.1467-9892.2008.00598.x.
Akutowicz, E. J. 1958. “On an Explicit Formula in Linear Least Squares Prediction.” Mathematica Scandinavica: 261–6. https://doi.org/10.7146/math.scand.a-10503.
Andrews, D. W. K. 1991. “Heteroskedasticity and Autocorrelation Consistent Covariance Matrix Estimation.” Econometrica 59 (3): 817–58. https://doi.org/10.2307/2938229.
Basawa, I. V., and R. Lund. 2001. “Large Sample Properties of Parameter Estimates for Periodic ARMA Models.” Journal of Time Series Analysis 22 (6): 651–63. https://doi.org/10.1111/1467-9892.00246.
Basawa, I. V., R. Lund, and Q. Shao. 2004. “First-order Seasonal Autoregressive Processes with Periodically Varying Parameters.” Statistics & Probability Letters 67 (4): 299–306. https://doi.org/10.1016/j.spl.2004.02.001.
Battaglia, F., D. Cucina, and M. Rizzo. 2018. “A Generalization of Periodic Autoregressive Models for Seasonal Time Series.” Technical Report 2. Department of Statistical Sciences, University La Sapienza.
Battaglia, F., D. Cucina, and M. Rizzo. 2020. “Parsimonious Periodic Autoregressive Models for Time Series with Evolving Trend and Seasonality.” Statistics and Computing 30 (1): 77–91. https://doi.org/10.1007/s11222-019-09866-0.
Berk, K. N. 1974. “Consistent Autoregressive Spectral Estimates.” Annals of Statistics 2: 489–502. https://doi.org/10.1214/aos/1176342709.
Bollerslev, T., and E. Ghysels. 1996. “Periodic Autoregressive Conditional Heteroscedasticity.” Journal of Business & Economic Statistics 14 (2): 139–51. https://doi.org/10.1080/07350015.1996.10524640.
Boubacar Mainassara, Y. 2011. “Multivariate Portmanteau Test for Structural VARMA Models with Uncorrelated but Non-independent Error Terms.” Journal of Statistical Planning and Inference 141 (8): 2961–75. https://doi.org/10.1016/j.jspi.2011.03.022.
Boubacar Maïnassara, Y. 2012. “Selection of Weak VARMA Models by Modified Akaike’s Information Criteria.” Journal of Time Series Analysis 33 (1): 121–30. https://doi.org/10.1111/j.1467-9892.2011.00746.x.
Boubacar Mainassara, Y., and C. Francq. 2011. “Estimating Structural VARMA Models with Uncorrelated but Non-independent Error Terms.” Journal of Multivariate Analysis 102 (3): 496–505. https://doi.org/10.1016/j.jmva.2010.10.009.
Boubacar Mainassara, Y., M. Carbon, and C. Francq. 2012. “Computing and Estimating Information Matrices of Weak ARMA Models.” Computational Statistics & Data Analysis 56 (2): 345–61. https://doi.org/10.1016/j.csda.2011.07.006.
Boubacar Maïnassara, Y., and C. C. Kokonendji. 2016. “Modified Schwarz and Hannan-Quinn Information Criteria for Weak VARMA Models.” Statistical Inference for Stochastic Processes 19 (2): 199–217. https://doi.org/10.1007/s11203-015-9123-z.
Boubacar Maïnassara, Y., and B. Saussereau. 2018. “Diagnostic Checking in Multivariate ARMA Models with Dependent Errors Using Normalized Residual Autocorrelations.” Journal of the American Statistical Association 113 (524): 1813–27. https://doi.org/10.1080/01621459.2017.1380030.
Brockwell, P. J., and R. A. Davis. 1991. Time Series: Theory and Methods. Springer Series in Statistics, 2nd ed. New York: Springer-Verlag. https://doi.org/10.1007/978-1-4419-0320-4.
den Haan, W. J., and A. T. Levin. 1997. “A Practitioner’s Guide to Robust Covariance Matrix Estimation.” In Robust Inference, Vol. 15 of Handbook of Statistics, 299–342. Amsterdam: North-Holland. https://doi.org/10.1016/S0169-7161(97)15014-3.
Dufour, J.-M., and D. Pelletier. 2021. “Practical Methods for Modeling Weak VARMA Processes: Identification, Estimation and Specification with a Macroeconomic Application.” Journal of Business & Economic Statistics: 1–13. https://doi.org/10.1080/07350015.2021.1904960.
Francq, C., R. Roy, and A. Saidi. 2011. “Asymptotic Properties of Weighted Least Squares Estimation in Weak PARMA Models.” Journal of Time Series Analysis 32 (6): 699–723. https://doi.org/10.1111/j.1467-9892.2011.00728.x.
Francq, C., and J.-M. Zakoïan. 2007. “HAC Estimation and Strong Linearity Testing in Weak ARMA Models.” Journal of Multivariate Analysis 98 (1): 114–44. https://doi.org/10.1016/j.jmva.2006.02.003.
Francq, C., and J.-M. Zakoïan. 2019. GARCH Models: Structure, Statistical Inference and Financial Applications. New York: Wiley. https://doi.org/10.1002/9781119313472.
Giovanis, E. 2014. “The Turn-of-the-Month Effect: Evidence from Periodic Generalized Autoregressive Conditional Heteroskedasticity (PGARCH) Model.” International Journal of Economic Sciences and Applied Research 7 (3): 43–61. https://doi.org/10.2139/ssrn.2479295.
Hipel, K., and A. I. McLeod. 1994. Time Series Modelling of Water Resources and Environmental Systems. Amsterdam: Elsevier.
Jones, R. H., and W. M. Brelsford. 1967. “Time Series with Periodic Structure.” Biometrika 54 (3–4): 403–8. https://doi.org/10.1093/biomet/54.3-4.403.
Katayama, N. 2012. “Chi-squared Portmanteau Tests for Structural VARMA Models with Uncorrelated Errors.” Journal of Time Series Analysis 33 (6): 863–72. https://doi.org/10.1111/j.1467-9892.2012.00799.x.
Lund, R., and I. V. Basawa. 2000. “Recursive Prediction and Likelihood Evaluation for Periodic ARMA Models.” Journal of Time Series Analysis 21 (1): 75–93. https://doi.org/10.1111/1467-9892.00174.
Lütkepohl, H. 2005. New Introduction to Multiple Time Series Analysis. Berlin: Springer-Verlag. https://doi.org/10.1007/978-3-540-27752-1.
Morgan, J., and J. Tatar. 1972. “Calculation of the Residual Sum of Squares for All Possible Regressions.” Technometrics 14 (2): 317–25. https://doi.org/10.1080/00401706.1972.10488918.
Newey, W. K., and K. D. West. 1987. “A Simple, Positive Semidefinite, Heteroskedasticity and Autocorrelation Consistent Covariance Matrix.” Econometrica 55 (3): 703–8. https://doi.org/10.2307/1913610.
Noakes, D. J., A. I. McLeod, and K. W. Hipel. 1985. “Forecasting Monthly Riverflow Time Series.” International Journal of Forecasting 1 (2): 179–90. https://doi.org/10.1016/0169-2070(85)90022-6.
Pagano, M. 1978. “On Periodic and Multiple Autoregressions.” Annals of Statistics 6 (6): 1310–7. https://doi.org/10.1214/aos/1176344376.
Reinsel, G. C. 1997. Elements of Multivariate Time Series Analysis. Springer Series in Statistics, 2nd ed. New York: Springer-Verlag. https://doi.org/10.1007/978-1-4612-0679-8.
Salas, J. D. 1980. Applied Modeling of Hydrologic Time Series. Water Resources Publication. https://doi.org/10.1016/0309-1708(80)90028-7.
Salas, J. D., D. C. Boes, and R. A. Smith. 1982. “Estimation of ARMA Models with Seasonal Parameters.” Water Resources Research 18 (4): 1006–10. https://doi.org/10.1029/wr018i004p01006.
Thompstone, R. M., K. W. Hipel, and A. I. McLeod. 1985. “Grouping of Periodic Autoregressive Models.” Time Series Analysis: Theory and Practice 6: 35–49.
Ursu, E., and P. Duchesne. 2009. “Estimation and Model Adequacy Checking for Multivariate Seasonal Autoregressive Time Series Models with Periodically Varying Parameters.” Statistica Neerlandica 63 (2): 183–212. https://doi.org/10.1111/j.1467-9574.2009.00417.x.
Ursu, E., and J.-C. Pereau. 2016. “Application of Periodic Autoregressive Process to the Modeling of the Garonne River Flows.” Stochastic Environmental Research and Risk Assessment 30 (7): 1785–95. https://doi.org/10.1007/s00477-015-1193-3.
Vecchia, A. 1985a. “Maximum Likelihood Estimation for Periodic Autoregressive Moving Average Models.” Technometrics 27 (4): 375–84. https://doi.org/10.1080/00401706.1985.10488076.
Vecchia, A. 1985b. “Periodic Autoregressive-Moving Average (PARMA) Modeling with Applications to Water Resources.” JAWRA Journal of the American Water Resources Association 21 (5): 721–30. https://doi.org/10.1111/j.1752-1688.1985.tb00167.x.
Vecchia, A. V. 1985c. “Periodic Autoregressive-Moving Average Modeling with Applications to Water Resources.” Journal of the American Water Resources Association 21 (5): 721–30. https://doi.org/10.1111/j.1752-1688.1985.tb00167.x.
© 2022 Walter de Gruyter GmbH, Berlin/Boston