In this paper we have constructed goodness-of-fit tests incorporating several components, such as the expectation and covariance function for identifying a non-centered univariate random sequence, or autocovariances and cross-covariances for identifying a centered multivariate random sequence. For the construction of the corresponding estimators and the investigation of their properties we utilized the theory of square Gaussian random variables.
When investigating various processes and phenomena in practice, we very often deal with random sequences, or time series. The latter notion is used more frequently, but in this paper we will use the former one to stress the similarity with random processes.
There are many books and papers devoted to this topic. In particular, the classical books on statistical analysis of time series are written by Anderson , Box and Jenkins , and Brockwell and Davis . Many of the goodness-of-fit tests in time series analysis are residual-based. For example, the classic portmanteau test of Box and Pierce  and its improvement by Ljung and Box  are based on the sample autocorrelations of the residuals. Chen and Deo  proposed some diagnostic tests based on a spectral approach to the residuals.
Within the model-based approach to time series analysis, estimated residuals are computed once a fitted model has been obtained from the data, and these residuals are then tested for white-noise behavior. Such tests require the computation of residuals from the fitted model, which can be quite tedious when the model does not have a finite-order autoregressive representation. Moreover, in such cases the residuals are not uniquely defined.
In this paper we use another approach, drawn from the theory of stochastic processes. It is well known that random sequences, just like random processes, can be identified through their expectation and covariance function. So, to construct a goodness-of-fit test for a non-centered univariate stationary Gaussian sequence we need to aggregate the information about both of these components, and we do it with the help of quadratic forms. Another issue of interest to us is centered multivariate random sequences. In this case we can also incorporate information on every component into the test through quadratic forms.
For the estimator construction we utilize the theory of square Gaussian random variables. This theory was developed in the works [10, 12, 13] for the investigation of stochastic processes. In the book by Buldygin and Kozachenko  the properties of the space of square Gaussian random variables were studied.
So, at the beginning we investigate properties of quadratic forms of square Gaussian random variables raised to some power p, and then we construct the test. For these tests we do not need to compute the residuals, and they can be applied even in the case of infinite-order representations.
This paper is the continuation of a series of works. In the papers  and  we considered centered univariate random sequences and constructed goodness-of-fit tests using different types of statistics: one was based on properties of the maximum of square Gaussian random variables, another was built using inequalities for square Gaussian random variables raised to the power p. The statistics utilizing the maximum of square Gaussian random variables were constructed for non-centered univariate random sequences and for multivariate but centered random sequences in .
The paper consists of five sections and two annexes. The second section is devoted to the theory of square Gaussian random variables and contains the main definitions and results. Sections 3 and 4 apply the estimate obtained in Section 2 to construct different aggregated tests. The criterion in Section 3 is constructed for testing the aggregated hypothesis on the expectation and covariance function of a non-centered stationary Gaussian sequence. In Section 4 we consider centered Gaussian multivariate stationary sequences. The residual-based approach dominates in the multivariate case too; see, for example, the papers by Hosking [7, 8], Mahdi and McLeod  and the references therein. The goodness-of-fit test we have constructed is based on fitting the covariance function. Section 5 draws some conclusions. Some necessary mathematical calculations are relegated to the annexes at the end.
2 Square Gaussian random variables
Let be a family of jointly Gaussian random variables for which for all .
Definition 2.1 ()
The space is the space of square Gaussian random variables if any element can be presented as
where , , , is a real-valued matrix; or the element is the square mean limit of the sequence of the form (1), that is,
It was proved by Buldygin and Kozachenko in  that is a linear space.
For the square Gaussian random variables the following results hold true.
Theorem 2.2 ()
Let be a random vector such that and let be a symmetric semi-definite matrix. Then for all the following inequality is true:
where , .
Let be a vector such that and let be a symmetric and semi-definite matrix. Then for any and the following inequality holds true:
It can be easily seen that for all and we have , or equivalently, . Hence, for all ,
Here the function R comes from Theorem 2.2. ∎
Let , , be a sequence of random variables that can be presented as a quadratic form of square Gaussian random variables, that is,
where with , and let be a symmetric semi-definite matrix. Then
for all and , where , .
Let us consider , and . Applying Chebyshev’s inequality, we obtain that
The Minkowski inequality and (3) imply that
where , , .
Let us investigate the behavior of the function when . It reaches its minimum value at the point and
But since the Minkowski inequality is valid only for , we should be sure that . This will be true if .
So, we can minimize the right-hand side of inequality (5) with regard to r assuming and obtain that
Denoting with , we get
The function within reaches its minimum at the points . We will only consider since this function is even. As long as for all , we have . The function f attains its minimum value at the point and it is equal to
Inequality (4) holds true for δ satisfying the relation
The last condition is fulfilled as . ∎
3 Goodness-of-fit test for non-centered univariate sequences
The results obtained in the previous section can be utilized for constructing tests of the joint hypothesis about the expectation and covariance function of a non-centered univariate stationary Gaussian sequence.
Let us consider the stationary sequence for which is its expectation and
is its covariance function. Hereinafter we consider stationarity in the strict sense.
We assume that we have () consecutive observations of this random sequence. Let us consider the estimators for the expectation and covariance function as follows:
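As an illustration only, the estimators just introduced can be sketched as the usual sample mean and sample autocovariance computed from N consecutive observations; the normalization by N - m below is an assumption, since the exact displayed formulas may use a different convention:

```python
import numpy as np

def sample_mean(x):
    """Estimator of the expectation from N consecutive observations."""
    return float(np.mean(x))

def sample_autocovariance(x, m):
    """Estimator of the covariance function at lag m; dividing by the
    number of products N - m is one common convention (an assumption here)."""
    n = len(x)
    xc = x - np.mean(x)
    return float(np.dot(xc[: n - m], xc[m:])) / (n - m)

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.0, size=10_000)  # Gaussian white noise with mean 2
print(sample_mean(x))               # close to 2
print(sample_autocovariance(x, 0))  # close to 1
print(sample_autocovariance(x, 5))  # close to 0 for white noise
```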
For every estimator above we can evaluate the quantities
So, the following random variables are square Gaussian:
Let us define the vectors
For any semi-definite matrix the random variables
are actually the quadratic forms of square Gaussian random variables.
(that is, is the identity matrix of order 2). Then
All necessary formulas for the terms of and their derivations are included in Annex 1.
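To make the aggregation step concrete: with the identity matrix as the weight, the quadratic form of the estimator deviations reduces to a plain sum of squares, raised to the power p. The sketch below uses illustrative names only and is not the paper's exact statistic:

```python
import numpy as np

def quadratic_form_stat(deviations, B=None, p=1.0):
    """Aggregate a vector v of estimator deviations (e.g. the deviations of
    the sample mean and sample covariances from their hypothesized values)
    into the statistic (v^T B v)^p.  With B the identity matrix this is
    simply the sum of squared deviations raised to the power p."""
    v = np.asarray(deviations, dtype=float)
    if B is None:
        B = np.eye(len(v))  # the simplest choice of the weight matrix
    return float(v @ B @ v) ** p

# two deviations aggregated with the identity matrix of order 2
print(quadratic_form_stat([0.3, -0.4]))         # 0.09 + 0.16 = 0.25
print(quadratic_form_stat([0.3, -0.4], p=2.0))  # 0.25 ** 2 = 0.0625
```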
Consider the stationary sequence with and covariance function
where , and
In our case when the random variables
Let us use the following notation:
Denoting for simplicity
we will get
This implies the statement of the theorem. ∎
The desired property for is as . It is fulfilled for many covariance functions, for instance, for , , .
Using inequality (12), we can construct the goodness-of-fit test.
Let the null hypothesis state that for non-centered Gaussian stationary sequence the expectation is and its covariance function is given by , . The alternative implies the opposite statement. The random variables are as determined in (6)–(11) with . If for significance level α, some fixed and , with ,
then the hypothesis should be rejected and accepted otherwise. Here is a critical value that can be found from the equation
and taking into account the restriction .
The criterion follows from Theorem 3.3. ∎
The formulas for the evaluation of can be found in Annex 1.
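Operationally, the criterion is a threshold rule: compute the aggregated statistic and compare it with the critical value solving the tail-bound equation at level α. Since the paper's bound function is given in the theorems above, the sketch below only illustrates the numerical step, using bisection together with a purely illustrative placeholder bound (not the paper's function R):

```python
import math

def critical_value(tail_bound, alpha, lo=1e-9, hi=1e9, iters=200):
    """Solve tail_bound(x) = alpha by bisection, assuming tail_bound is
    continuous and decreasing in x (true for Chebyshev-type tail bounds)."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if tail_bound(mid) > alpha:
            lo = mid   # bound still above alpha: threshold lies to the right
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Placeholder bound for illustration only, NOT the paper's function:
bound = lambda x: math.exp(-x)
x_alpha = critical_value(bound, 0.05)
print(round(x_alpha, 3))  # -ln(0.05), about 2.996

def reject(statistic, x_alpha):
    """Reject the null hypothesis when the statistic exceeds the critical value."""
    return statistic > x_alpha
```

By construction the type I error of such a rule is at most α, since the bound dominates the true tail probability of the statistic.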
4 Goodness-of-fit tests for the centered multivariate random sequences
Inequality (4) can also be useful for testing hypotheses about the covariance function of centered multivariate random sequences.
Let us assume that the components of the multivariate random sequence , , are jointly Gaussian, stationary (in the strict sense) sequences for which and
is the covariance function of this multivariate sequence. It is worth mentioning that if , then is an ordinary autocovariance function of the k-th component, and when , are the joint covariances, sometimes called cross-covariances. Hereinafter we shall use for the term covariance function of the sequence.
We suppose that the sequence is observed at the points (). As an estimator of the covariance we choose
The estimator is unbiased for :
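For a centered sequence the standard cross-covariance estimator averages the products of the components at the given lag; dividing by the number of products, N - m, matches the unbiasedness claim above, though the exact normalization in the displayed formula is an assumption here:

```python
import numpy as np

def cross_covariance(x, k, l, m):
    """Estimator of the covariance B_{kl}(m) for a centered multivariate
    sequence.  x has shape (d, N): d components observed at N points.
    For a centered sequence, E[x_k(t) x_l(t+m)] = B_{kl}(m), so dividing
    the sum by the number of terms N - m gives an unbiased estimator."""
    n = x.shape[1]
    return float(np.dot(x[k, : n - m], x[l, m:])) / (n - m)

rng = np.random.default_rng(1)
x = rng.normal(size=(2, 50_000))  # two independent centered white-noise components
print(cross_covariance(x, 0, 0, 0))  # close to 1 (autocovariance at lag 0)
print(cross_covariance(x, 0, 1, 3))  # close to 0 (independent components)
```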
The random variables
are square Gaussian since can be presented as
where the matrix is
Let be a vector with components and let be some semi-definite matrices. In this case we can construct the goodness-of-fit test for a centered multivariate random sequence using the results of Theorem 2.4.
Let the null hypothesis state that for the centered Gaussian stationary multivariate sequence , its covariance function equals , , while the alternative states the opposite. If for some fixed , , with , some significance level α and corresponding critical value (), which can be found from the equation
for any semi-definite matrices and random variables ,
then the hypothesis should be rejected and accepted otherwise.
The criterion follows from Theorem 2.4. ∎
The probability of type I error for Criterion 2 is less than or equal to α.
The simplest way is to choose the matrix to be the identity matrix of the corresponding order.
Let us consider the 2-component () stationary centered Gaussian sequence
If we choose the matrix to be the identity matrix of order four, then and
The needed formulas for evaluation of are included in Annex 2.
In this paper we estimated the distribution of quadratic forms raised to the power p of square Gaussian random variables. This result made it possible to build the criterion for testing a hypothesis on expectation and covariance function of the non-centered univariate stationary Gaussian sequence and a hypothesis on the covariance function of the centered multivariate stationary Gaussian sequence.
Our test statistics are quite easy to compute and do not require the calculation of residuals from the fitted model. This is especially advantageous when the fitted model is not a finite order autoregressive model.
There is, of course, a lot of room for improvement of the tests. Comparison with other tests and finding the number N for which the null and a simple alternative hypothesis can be distinguished are also very important issues for further investigation.
This annex includes the calculations needed in Section 3. In particular, the general formulas are given for and .
Using Isserlis’ formula for centered Gaussian random variables , we obtain
Using again Isserlis’ formula, we obtain that
Using Isserlis’ formula for the centered Gaussian random variables, we obtain
Funding statement: The third author’s research was supported by the Visegrad Scholarship Program-EaP (No. 51601704).
Anderson T. W., The Statistical Analysis of Time Series, John Wiley & Sons, New York, 1971.
Box G. E. P., Jenkins G. M. and Reinsel G. C., Time Series Analysis: Forecasting and Control, 4th ed., John Wiley & Sons, Hoboken, 2011.
Box G. E. P. and Pierce D. A., Distribution of the residual autocorrelations in autoregressive integrated moving average time series models, J. Amer. Statist. Assoc. 65 (1970), 1509–1526, doi:10.1080/01621459.1970.10481180.
Brockwell P. J. and Davis R. A., Time Series: Theory and Methods, 2nd ed., Springer Ser. Statist., Springer, New York, 2009.
Chen W. W. and Deo R. S., A generalized portmanteau goodness-of-fit test for time series models, Econometric Theory 20 (2004), no. 2, 382–416.
Hosking J. R. M., Lagrange-multiplier tests of multivariate time-series models, J. R. Stat. Soc. Ser. B. Stat. Methodol. 43 (1981), no. 2, 219–230, doi:10.1111/j.2517-6161.1981.tb01174.x.
Ianevych T. O., An -criterion for testing a hypothesis about the covariance function of a random sequence, Theory Probab. Math. Statist. 92 (2016), 163–173, doi:10.1090/tpms/990.
Kozachenko Y. V. and Fedoryanych T. V., A criterion for testing hypotheses about the covariance function of a Gaussian stationary process, Theory Probab. Math. Statist. 69 (2004), 85–94, doi:10.1090/S0094-9000-05-00616-2.
Kozachenko Y. V. and Stadnik A. I., Pre-Gaussian processes and convergence in of estimators of covariance function, Theory Probab. Math. Statist. 45 (1991), 51–57.
Kozachenko Y. V. and Stus O. V., Square-Gaussian random processes and estimators of covariance functions, Math. Commun. 3 (1998), no. 1, 83–94.
Kozachenko Y. V. and Yakovenko T. O., Criterion for testing the hypothesis about the covariance function of the stationary Gaussian random sequence (in Ukrainian), Bull. Uzhgorod Univ. Ser. Math. Inform. 20 (2010), 39–43.
© 2017 by De Gruyter