# Abstract

In this paper we construct goodness-of-fit tests that incorporate several components: the expectation and the covariance function for the identification of a non-centered univariate random sequence, or the auto-covariances and cross-covariances for the identification of a centered multivariate random sequence. For the construction of the corresponding estimators and the investigation of their properties we utilize the theory of square Gaussian random variables.

## 1 Introduction

When investigating various processes and phenomena, in practice we very often deal with random sequences, or time series. The latter term is used more frequently, but in this paper we use the former to stress the similarity with random processes.

There are many books and papers devoted to this topic. In particular, the classical books on statistical analysis of time series are written by Anderson [1], Box and Jenkins [2], and Brockwell and Davis [4]. Many of the goodness-of-fit tests in time series analysis are residual-based. For example, the classic portmanteau test of Box and Pierce [3] and its improvement by Ljung and Box [15] are based on the sample autocorrelations of the residuals. Chen and Deo [6] proposed some diagnostic tests based on a spectral approach of the residuals.

Within the model-based approach to time series analysis, estimated residuals are computed once a fitted model has been obtained from the data, and then tested whether they behave like white noise. These tests require the computation of residuals from the fitted model, which can be quite tedious when the model does not have a finite order autoregressive representation. Also, in such cases, the residuals are not uniquely defined.
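For concreteness, the residual-based portmanteau approach described above can be sketched as follows. This is a minimal illustration of the classical Ljung–Box statistic of [15], not one of the tests constructed in this paper:

```python
import numpy as np

def ljung_box(residuals, max_lag):
    """Ljung-Box portmanteau statistic Q = n(n+2) * sum_{k=1}^{h} r_k^2 / (n - k),
    where r_k is the lag-k sample autocorrelation of the residuals."""
    e = np.asarray(residuals, dtype=float)
    e = e - e.mean()
    n = len(e)
    denom = np.dot(e, e)
    q = 0.0
    for k in range(1, max_lag + 1):
        r_k = np.dot(e[:-k], e[k:]) / denom  # lag-k sample autocorrelation
        q += r_k ** 2 / (n - k)
    return n * (n + 2) * q
```

Under the null hypothesis of white-noise residuals, Q is approximately chi-square distributed, which is what makes the test easy to calibrate once the residuals are available.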

In this paper we use another approach, which comes from the
theory of stochastic processes. It is well known that random sequences, like random processes, can be identified through their
expectation and covariance function. So, to construct a goodness-of-fit test for a
*non*-centered univariate stationary Gaussian sequence, we need to aggregate
the information about both these components, and we do it with the help of quadratic forms.
Another object of our interest is centered *multivariate* random sequences.
In this case, too, we can incorporate information on every component into the test through quadratic forms.
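The aggregation idea can be illustrated with a minimal sketch: a vector of normalized deviations (here a placeholder `xi`) is collapsed into a single scalar through a quadratic form with a positive semi-definite weight matrix `A`. The names are ours; the paper's actual statistics are built from square Gaussian variables as described in Sections 3 and 4:

```python
import numpy as np

def quadratic_form(xi, A):
    """Aggregate a vector of deviations xi into the scalar xi^T A xi.

    A is assumed symmetric positive semi-definite, so the result is >= 0
    and grows when any component of xi deviates from zero."""
    xi = np.asarray(xi, dtype=float)
    A = np.asarray(A, dtype=float)
    return float(xi @ A @ xi)
```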

For the estimator construction we utilize the theory of square Gaussian random variables. This theory was developed in the works [10, 12, 13] for the investigation of stochastic processes. In the book by Buldygin and Kozachenko [5] the properties of the space of square Gaussian random variables were studied.

So, we first investigate properties of quadratic forms of square Gaussian random
variables raised to some power *p* and then construct the tests.
These tests do not require computing the residuals and can be
applied even in the case of infinite-order representations.

This paper continues a series of works.
In the papers [14] and [9] we considered *centered*
univariate random sequences and constructed goodness-of-fit
tests using different types of statistics: one was based on properties of the maximum of square Gaussian random variables,
another was built using inequalities for square Gaussian random variables raised to the power *p*.
Statistics utilizing the maximum of square Gaussian random variables were constructed for *non*-centered
univariate random sequences and for multivariate but *centered* random sequences in [11].

The paper consists of five sections and two annexes. The second section is devoted to
the theory of square Gaussian random variables and contains the
main definitions and results. Sections 3 and 4 apply the
estimate obtained in Section 2 to construct different aggregated tests.
The criterion in Section 3 is
constructed for testing the aggregated hypothesis on expectation
and covariance function of the *non*-centered stationary
Gaussian sequence.
In Section 4 we consider the centered Gaussian multivariate
stationary sequences. The residual-based approach dominates
in the multivariate case too; see, for example, the papers by Hosking [7, 8],
Mahdi and McLeod [16] and the references therein. The goodness-of-fit test
we have constructed is based on fitting the covariance function.
Section 5 draws some conclusions.
Some necessary mathematical calculations are relegated to the annexes at
the end.

## 2 Square Gaussian random variables

Let

## Definition 2.1 ([13])

The space

where

It was proved by Buldygin and Kozachenko in [5] that

For the square Gaussian random variables the following results hold true.

## Theorem 2.2 ([10])

*Let *

*where *

*Let *

## Proof.

It can be easily seen that for all

Here the function *R* comes from Theorem 2.2.
∎

*Let *

*where
*

*for all *

## Proof.

Let us consider

The Minkowski inequality and (3) imply that

Therefore,

where

Let us investigate the behavior of the function

But since the Minkowski inequality is valid only for

So, we can minimize the right-hand side of inequality
(5) with respect to *r*, assuming

Denoting

where

The function *f* attains its minimum value at the point

Inequality (4) holds true for δ satisfying the relation

The last condition is fulfilled as

## 3 Goodness-of-fit test for non-centered univariate sequences

The results obtained in the previous section can be used to construct tests of the joint hypothesis about the expectation and the covariance function of a non-centered univariate stationary Gaussian sequence.

Let us consider the stationary sequence

is its covariance function. Hereinafter, stationarity is understood in the strict sense.
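As an illustration, the standard sample estimators of the expectation and of the covariance function of an observed stretch X_1, …, X_N can be sketched as follows. The function names and the 1/N normalization are our assumptions; the paper's own estimators are those introduced below:

```python
import numpy as np

def sample_mean(x):
    """Estimator of the expectation a of a stationary sequence."""
    return float(np.mean(x))

def sample_cov(x, m):
    """Estimator of the covariance function B(m) at lag m >= 0.

    Uses the 1/N normalization; other conventions divide by N - m."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    a = x.mean()
    return float(np.dot(x[: n - m] - a, x[m:] - a) / n)
```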

We assume that we have

For every estimator above we can evaluate the quantities

Quantities (6) and (7) can be rewritten in the
following form (

where

So, the following random variables are square Gaussian:

and

Let us define the vectors

For any semi-definite matrix

are actually the quadratic forms of square Gaussian random variables.

Suppose that

(that is,

and

All necessary formulas for the terms of

Further on we consider the particular case when

*Consider the stationary sequence *

*The random variables
*

*where *

## Proof.

In our case when

Let us use the following notation:

Then

since for

Therefore

as

Denoting for simplicity

we will get

This implies the statement of the theorem. ∎

The desired property for

Using inequality (12), we can construct the goodness-of-fit test.

Let the null hypothesis

then the hypothesis

with

and
taking into account the restriction

## Proof.

The criterion follows from Theorem 3.3. ∎

The formulas for the evaluation of

## 4 Goodness-of-fit tests for centered multivariate random sequences

Inequality (4) can also be useful for testing
hypotheses on the covariance
of centered *multivariate* random sequences.
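As an illustration, for a centered multivariate sequence observed as an N × d array, a natural sample estimator of the cross-covariance between components k and l at lag m is the following average (a sketch under the 1/N convention; the names are ours):

```python
import numpy as np

def sample_cross_cov(x, k, l, m):
    """Estimator of the cross-covariance C_{kl}(m) = E X_k(i) X_l(i+m)
    for a centered multivariate sequence x of shape (N, d)."""
    x = np.asarray(x, dtype=float)
    n = x.shape[0]
    return float(np.dot(x[: n - m, k], x[m:, l]) / n)
```

For k = l this reduces to the auto-covariance of a single component, so one routine covers all entries of the matrix covariance function.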

Let us assume that the components of the multivariate random
sequence

is the covariance function of this multivariate
sequence. It is worth mentioning that if *k*-th component and
when *covariance function of the sequence*

We suppose that the sequence

The estimator

The random variables

are square
Gaussian since

where the matrix is

Let

Let the null hypothesis

for any semi-definite matrices

then the hypothesis

## Proof.

The criterion follows from Theorem 2.4. ∎

The probability of type I error for Criterion 2 is less than or equal to α.
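Because the threshold is derived from an upper bound on the tail probability, the test is conservative, and the realized type I error can be checked empirically. A Monte Carlo sketch, with `test_stat` as a placeholder for whichever criterion statistic is used and i.i.d. standard normal data standing in for the null model:

```python
import numpy as np

def empirical_type_one_error(test_stat, threshold, n_rep, n_obs, rng):
    """Estimate the type I error of the rule 'reject if test_stat(x) > threshold'
    by simulating n_rep sequences from the null (here: Gaussian white noise)."""
    rejections = 0
    for _ in range(n_rep):
        x = rng.standard_normal(n_obs)  # null model: centered Gaussian white noise
        if test_stat(x) > threshold:
            rejections += 1
    return rejections / n_rep
```

The resulting rejection frequency should stay below the nominal level α, typically with some margin, reflecting the conservativeness of the bound.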

The simplest way is to choose the matrix

Let us consider the 2-component (

Let

and

If we choose the matrix

The formulas needed for the evaluation of

## 5 Conclusions

In this paper we estimated the distribution of quadratic forms of
square Gaussian random variables raised to the power *p*. This result made it possible to
build the criterion for testing a hypothesis on expectation and
covariance function of the *non*-centered univariate stationary
Gaussian sequence and a hypothesis on the covariance function of the
centered multivariate stationary Gaussian sequence.

Our test statistics are quite easy to compute and do not require the calculation of residuals from the fitted model. This is especially advantageous when the fitted model is not a finite order autoregressive model.

There is, of course, a lot of room for improvement of the tests.
Comparison with other tests and determination of the sample size *N* for which the
null and a simple alternative hypothesis can be distinguished are also
very important issues for further investigation.

## Annex 1

This annex includes the calculations needed in
Section 3. In
particular, the general formulas are given for

### For $\mathsf{E}\bigl(\xi_N^{a}(m)\bigr)^2$:

We have

Using Isserlis’ formula for centered Gaussian random variables
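For reference, Isserlis' formula for four jointly Gaussian centered random variables $\xi_1,\dots,\xi_4$, which underlies the computations in this annex, reads:

```latex
\mathsf{E}\,\xi_1\xi_2\xi_3\xi_4
  = \mathsf{E}\,\xi_1\xi_2\,\mathsf{E}\,\xi_3\xi_4
  + \mathsf{E}\,\xi_1\xi_3\,\mathsf{E}\,\xi_2\xi_4
  + \mathsf{E}\,\xi_1\xi_4\,\mathsf{E}\,\xi_2\xi_3 .
```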

Then

### For $\mathsf{E}\bigl(\xi_N^{B}(m)\bigr)^2$:

We have

and

Using again Isserlis’ formula, we obtain that

where

and

Therefore

## Annex 2

In Section 4 we need to find the expectation of

Using Isserlis’ formula for the centered Gaussian random variables, we obtain

Then

**Funding statement:** The third author’s research was supported by the Visegrad Scholarship Program – EaP (No. 51601704).

### References

[1] Anderson T. W., The Statistical Analysis of Time Series, John Wiley & Sons, New York, 1971.

[2] Box G. E. P., Jenkins G. M. and Reinsel G. C., Time Series Analysis: Forecasting and Control, 4th ed., John Wiley & Sons, Hoboken, 2011.

[3] Box G. E. P. and Pierce D. A., Distribution of the residual autocorrelations in autoregressive integrated moving average time series models, J. Amer. Statist. Assoc. 65 (1970), 1509–1526.

[4] Brockwell P. J. and Davis R. A., Time Series: Theory and Methods, 2nd ed., Springer Ser. Statist., Springer, New York, 2009.

[5] Buldygin V. V. and Kozachenko Y. V., Metric Characterization of Random Variables and Random Processes, American Mathematical Society, Providence, 2000.

[6] Chen W. W. and Deo R. S., A generalized portmanteau goodness-of-fit test for time series models, Econometric Theory 20 (2004), no. 2, 382–416.

[7] Hosking J. R. M., The multivariate portmanteau statistic, J. Amer. Statist. Assoc. 75 (1980), 602–608.

[8] Hosking J. R. M., Lagrange-multiplier tests of multivariate time-series models, J. R. Stat. Soc. Ser. B. Stat. Methodol. 43 (1981), no. 2, 219–230.

[9] Ianevych T. O., An

[10] Kozachenko Y. V. and Fedoryanych T. V., A criterion for testing hypotheses about the covariance function of a Gaussian stationary process, Theory Probab. Math. Statist. 69 (2004), 85–94.

[11] Kozachenko Y. V. and Ianevych T. O., Some goodness of fit tests for random sequences, Lith. Math. J. Stat. 52 (2013), no. 1, 5–13.

[12] Kozachenko Y. V. and Stadnik A. I., Pre-Gaussian processes and convergence in

[13] Kozachenko Y. V. and Stus O. V., Square-Gaussian random processes and estimators of covariance functions, Math. Commun. 3 (1998), no. 1, 83–94.

[14] Kozachenko Y. V. and Yakovenko T. O., Criterion for testing the hypothesis about the covariance function of the stationary Gaussian random sequence (in Ukrainian), Bull. Uzhgorod Univ. Ser. Math. Inform. 20 (2010), 39–43.

[15] Ljung G. M. and Box G. E. P., On a measure of lack of fit in time series models, Biometrika 65 (1978), no. 2, 297–303.

[16] Mahdi E. and McLeod A. I., Improved multivariate portmanteau test, J. Time Series Anal. 33 (2012), no. 2, 211–222.

**Received:** 2016-10-02

**Accepted:** 2017-01-15

**Published Online:** 2017-01-28

**Published in Print:** 2017-03-01

© 2017 by De Gruyter