Asymptotic normality and mean consistency of LS estimators in the errors-in-variables model with dependent errors

Abstract In this article, we study an errors-in-variables regression model in which the errors are negatively superadditive dependent (NSD) random variables. First, a Marcinkiewicz-type strong law of large numbers for NSD random variables is established. Then, this strong law of large numbers is used to investigate the asymptotic normality of the least squares (LS) estimators of the unknown parameters. In addition, the mean consistency of the LS estimators is obtained. Several results for independent random variables and negatively associated random variables are thereby extended and improved to the NSD setting. Finally, two simulations are presented to verify the asymptotic normality and mean consistency of the LS estimators in the model.


Introduction
To correct the effects of sampling errors, Deaton [1] proposed the errors-in-variables (EV) regression model, which is somewhat more practical than the ordinary regression model and has therefore attracted considerable attention.
In this article, we consider the following simple linear EV regression model:

η_i = β_0 + β_1 x_i + ε_i, ξ_i = x_i + δ_i, i = 1, 2, …, n, (1.1)

where β_0 and β_1 are unknown parameters, the x_i are unknown constants, (ε_i, δ_i) are random errors, and only the pairs (η_i, ξ_i) are observable. In the last few decades, many authors have investigated EV models. In the case of independent random errors, the consistency of LS estimators in the linear EV model was established by Liu and Chen [2] and Chen et al. [3]; Miao et al. [4] obtained a central limit theorem for the LS estimators in the simple linear EV regression model; Xu and Li [5] studied the consistency of LS estimators in the linear EV regression model with replicate observations; Miao et al. [6] established some limit behaviors of estimators in the simple linear EV regression model; Miao and Yang [7] obtained a law of the iterated logarithm for the LS estimators in the EV regression model; Miao et al. [8] investigated the consistency and asymptotic normality of LS estimators in the simple linear EV regression model; and so on. In the case of dependent random errors, Fazekas and Kukush [9] obtained the consistency of the regression parameter estimators of nonlinear functional EV models under mixing conditions; Fan et al. [10] established asymptotic properties of the LS estimators of the unknown parameters in the simple linear EV regression model with stationary mixing errors; Miao et al. [11] derived the asymptotic normality and strong consistency of the estimators in the simple linear EV model with negatively associated (NA) errors; Miao et al. [12] obtained the weak and strong consistency of LS estimators of the unknown parameters with martingale difference errors; Shen [13] studied some asymptotic properties of estimators in the EV model with martingale difference errors; and so forth. In this article, we consider model (1.1) under the assumption that the random errors are negatively superadditive dependent (NSD) random variables, a concept proposed by Hu [14] as follows. A function φ: R^n → R is called superadditive if φ(x ∨ y) + φ(x ∧ y) ≥ φ(x) + φ(y) for all x, y ∈ R^n, where ∨ denotes the componentwise maximum and ∧ the componentwise minimum. A random vector (X_1, X_2, …, X_n) is said to be NSD if

E φ(X_1, X_2, …, X_n) ≤ E φ(X_1*, X_2*, …, X_n*)

for every superadditive function φ such that the expectations exist, where X_1*, X_2*, …, X_n* are independent random variables such that X_i* and X_i have the same distribution for each i. A sequence {X_n, n ≥ 1} of random variables is said to be NSD if (X_1, X_2, …, X_n) is NSD for every n ≥ 1.
Since then, a series of useful results on NSD sequences of random variables has been established. Hu [14] and Christofides and Vaggelatou [16] showed that the family of NSD sequences contains NA (in particular, independent) sequences as well as some sequences of random variables that do not deviate much from being NA. Hu [14] gave an example illustrating that NSD does not imply NA (see ref. [17]). Moreover, Hu [14] derived some basic properties and three structural theorems for NSD random variables. Eghbal et al. [18] provided two maximal inequalities and a strong law of large numbers for quadratic forms of NSD random variables. Some Rosenthal-type inequalities for maximum partial sums of NSD sequences were established by Wang et al. [19]. Complete convergence and complete moment convergence for arrays of rowwise NSD random variables were obtained by Meng et al. [20]. Amini et al. [21] investigated the complete convergence of moving-average processes of NSD random variables. Yu et al. [22] studied the M-test in linear models with NSD errors. Zeng and Liu [23] established the asymptotic normality of a difference-based estimator in a partially linear model with NSD errors. For more details about NSD random variables, one can refer to [24][25][26][27][28][29], among others.
However, to our knowledge, the asymptotic normality and mean consistency of LS estimators in the EV regression model with NSD random errors have not yet been studied in the literature. In this article, we mainly investigate the asymptotic properties of the estimators of the unknown parameters in the simple linear EV regression model (1.1). Many authors have obtained asymptotic properties of these estimators under independent random errors, an assumption that is often unrealistic in practice. Here we assume instead that the random errors are NSD, a class that includes independent and NA random variables as special cases, so studying model (1.1) with NSD random errors is of considerable significance. The main novelties of this article can be outlined as follows. First, a Marcinkiewicz-type strong law of large numbers for NSD random variables is established, which extends and improves the classical Marcinkiewicz-type strong law of large numbers for independent and identically distributed random variables to NSD random variables with non-identical distributions. Second, we use this strong law of large numbers to investigate the asymptotic normality of the LS estimators of the unknown parameters. In addition, the mean consistency of the LS estimators is obtained. These results include the corresponding ones for independent random errors and some dependent random errors as special cases. Finally, two simulation studies are carried out to verify the validity of the established results.
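The Marcinkiewicz-type normalization mentioned above can be illustrated numerically. The sketch below (a toy i.i.d. example with all parameter choices our own, not the NSD setting treated in this article) checks that S_n / n^{1/p} shrinks toward zero as n grows when E|X|^p < ∞ and EX = 0, here with p = 1.5 and Student-t(3) variables:

```python
import numpy as np

# Illustrative check of the Marcinkiewicz-type normalization
# S_n / n^{1/p} -> 0 a.s. for i.i.d. mean-zero X_i with E|X|^p < infinity.
# Student's t with 3 degrees of freedom has mean 0 and E|X|^1.5 finite.
rng = np.random.default_rng(1)
p = 1.5
n = 1_000_000
x = rng.standard_t(df=3, size=n)
s = np.cumsum(x)  # partial sums S_1, ..., S_n

# The normalized partial sum should shrink as the sample grows.
for n_k in [10**k for k in range(2, 7)]:
    print(n_k, s[n_k - 1] / n_k ** (1 / p))
```

The printed ratios drift toward zero, consistent with the classical i.i.d. Marcinkiewicz law that the article's Theorem generalizes to NSD sequences.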
A sequence {X_n, n ≥ 1} of random variables is said to be stochastically dominated by a random variable X if there exists a positive constant C such that P(|X_n| > x) ≤ C P(|X| > x) for all x ≥ 0 and all n ≥ 1. The remainder of this article is organized as follows. Under some mild conditions, the asymptotic normality and quadratic-mean consistency of the LS estimators of the unknown parameters in model (1.1) with NSD random errors are established in Section 2. Some preliminary lemmas are given in Section 3. The proofs of the main results are provided in Section 4. In Section 5, two simulations are carried out to study the numerical performance of the established results.
Throughout this article, C denotes a positive constant whose value may vary from place to place. →_P stands for convergence in probability, →_d stands for convergence in distribution, and →_a.s. stands for almost sure convergence.

Main results
Model (1.1) to be studied can be written explicitly as

η_i = β_0 + β_1 x_i + ε_i, ξ_i = x_i + δ_i, i = 1, 2, …, n, (2.1)

where β_0 and β_1 are unknown parameters, the x_i are unknown constants, (ε_i, δ_i) are random errors, and (η_i, ξ_i) are the observable pairs. Based on the observations (η_i, ξ_i), i = 1, 2, …, n, the LS estimators of β_1 and β_0 are

β̂_{n1} = Σ_{i=1}^n (ξ_i − ξ̄)(η_i − η̄) / Σ_{i=1}^n (ξ_i − ξ̄)^2, (2.2)

β̂_{n0} = η̄ − β̂_{n1} ξ̄, (2.3)

where ξ̄ = n^{−1} Σ_{i=1}^n ξ_i and η̄ = n^{−1} Σ_{i=1}^n η_i.
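The LS estimators of the simple linear EV model can be computed directly from the observable pairs. The following is a minimal sketch (independent normal errors and all numerical values are our own illustrative choices, not the article's NSD setting):

```python
import numpy as np

# Toy data from the simple linear EV model:
#   eta_i = beta0 + beta1 * x_i + eps_i   (observed response)
#   xi_i  = x_i + delta_i                 (observed covariate)
# All parameter values below are illustrative assumptions.
rng = np.random.default_rng(0)
n = 10_000
beta0, beta1 = 2.0, 1.5
x = np.linspace(0.0, 10.0, n)       # unknown design constants
eps = rng.normal(0.0, 0.3, n)       # independent errors, for simplicity
delta = rng.normal(0.0, 0.1, n)
eta = beta0 + beta1 * x + eps
xi = x + delta

# LS estimators based only on the observable pairs (eta_i, xi_i)
beta1_hat = np.sum((xi - xi.mean()) * (eta - eta.mean())) / np.sum((xi - xi.mean()) ** 2)
beta0_hat = eta.mean() - beta1_hat * xi.mean()
print(beta1_hat, beta0_hat)  # both close to the true values 1.5 and 2.0
```

Note that because the covariate is measured with error, the slope estimator carries a small attenuation bias; it is negligible here since the variance of δ is tiny relative to the spread of the x_i.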
To obtain our results, the following conditions are sufficient.

Asymptotic normality
In this subsection, we state the asymptotic normality of the LS estimators β̂_{n1} and β̂_{n0} of the unknown parameters β_1 and β_0.
Theorem 2.1. Let {ε_n, n ≥ 1} and {δ_n, n ≥ 1} be stationary NSD sequences of random variables that are independent of each other. Suppose that conditions (C_1)-(C_4) hold. Then β̂_{n1}, suitably normalized, converges in distribution to N(0, 1), where N(0, 1) denotes the standard normal distribution.
Theorem 2.2. Let {ε_n, n ≥ 1} and {δ_n, n ≥ 1} be strictly stationary NSD sequences of random variables that are independent of each other. Suppose that conditions (C_1)-(C_3) and (C_5)-(C_6) hold. Then β̂_{n0}, suitably normalized, converges in distribution to N(0, 1).

Remark 2.1. Since the family of NSD sequences of random variables includes independent and NA sequences, the results of Theorems 2.1 and 2.2 also hold for independent random errors and NA random errors.

Mean consistency
In this subsection, we state the quadratic-mean consistency of the LS estimators β̂_{n1} and β̂_{n0} of the unknown parameters β_1 and β_0.
Theorem 2.3. Let {ε_n, n ≥ 1} and {δ_n, n ≥ 1} be stationary NSD sequences of random variables that are independent of each other. Suppose that conditions (C_1)-(C_7) hold. Then β̂_{n1} and β̂_{n0} are consistent in quadratic mean, that is, E(β̂_{n1} − β_1)^2 → 0 and E(β̂_{n0} − β_0)^2 → 0 as n → ∞.
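Quadratic-mean consistency can be illustrated numerically. The sketch below (our own toy setup with independent normal errors, so only an analogy to the NSD case; the function name and all numerical values are assumptions) estimates E(β̂_{n1} − β_1)^2 by Monte Carlo for increasing n and shows it shrinking:

```python
import numpy as np

# Monte-Carlo estimate of the MSE of the slope estimator in the simple
# linear EV model for increasing sample sizes (illustrative parameters).
rng = np.random.default_rng(4)
beta0, beta1 = 1.0, 3.0

def mse_beta1(n, reps=300):
    err2 = np.empty(reps)
    for r in range(reps):
        x = np.arange(1, n + 1) / n * 5.0          # design constants
        eta = beta0 + beta1 * x + rng.normal(0, 0.5, n)
        xi = x + rng.normal(0, 0.1, n)
        b1 = np.sum((xi - xi.mean()) * (eta - eta.mean())) / np.sum((xi - xi.mean()) ** 2)
        err2[r] = (b1 - beta1) ** 2
    return err2.mean()

mses = [mse_beta1(n) for n in (50, 200, 800)]
print(mses)  # decreasing sequence of estimated mean-squared errors
```

The estimated MSE decreases monotonically across the three sample sizes, in line with convergence in quadratic mean (up to the small attenuation bias induced by the measurement error in the covariate).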

Preliminary lemmas
In this section, we present some important lemmas which will be used to prove the main results of the article.
Lemma 3.1. (Hu [14]) If (X_1, X_2, …, X_n) is NSD and g_1, g_2, …, g_n are all non-decreasing (or all non-increasing) functions, then (g_1(X_1), g_2(X_2), …, g_n(X_n)) is still NSD.

From Lemmas 3.1 and 3.2, we can easily derive the following corollary.
Corollary 3.1. Let X = (X_1, X_2, …, X_n) and Z = (Z_1, Z_2, …, Z_n) be independent random vectors. If X and Z are both NSD, then (X_1 + Z_1, X_2 + Z_2, …, X_n + Z_n) is also NSD. The next lemma, stated for an array {a_{ni}} of real numbers, provides conditions under which the corresponding weighted sums of NSD random variables are uniformly integrable in L^2.
Lemma 3.5. Let {X_n, n ≥ 1} be a sequence of random variables that is stochastically dominated by a random variable X. For any a > 0 and β > 0, the following two statements hold:

E|X_n|^β I(|X_n| ≤ a) ≤ C_1 [E|X|^β I(|X| ≤ a) + a^β P(|X| > a)],

E|X_n|^β I(|X_n| > a) ≤ C_2 E|X|^β I(|X| > a),

where C_1 and C_2 are positive constants. Consequently, E|X_n|^β ≤ C E|X|^β, where C is a positive constant.

First, we prove (3.4); the proof of (3.5) is analogous. The argument is divided into two cases. By the Markov inequality, Lemma 3.5, and E|X|^p < ∞, the required bound follows. Since m > 0 is arbitrary, the claim follows from (4.7); hence, for any fixed m, the desired estimate holds, and (4.10) follows from (C_7). Similarly to the proof of (4.9), one can obtain the analogous bound, which completes the proof.

Numerical simulations
In this section, we verify by simulation the asymptotic normality and mean consistency of the LS estimators in the EV regression model with NSD errors.
By the definition of NA random variables (see [17]), we know that (e_1, e_2, …, e_n) is NA, and hence it is also NSD.
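A construction of this kind can be sketched as follows (the function name and all numerical values are our own assumptions): a jointly normal vector with non-positive off-diagonal correlations is NA and therefore NSD, so it can serve as an NSD error sequence, and the LS slope estimator computed over many replications concentrates near the true value:

```python
import numpy as np

rng = np.random.default_rng(2)

def nsd_errors(n, rho=-0.1, sigma=0.5):
    # Jointly normal vector with a tridiagonal covariance: each pair of
    # neighbours has correlation rho <= 0, so the vector is NA, hence NSD.
    # |rho| < 0.5 keeps the covariance matrix positive definite.
    cov = np.eye(n)
    idx = np.arange(n - 1)
    cov[idx, idx + 1] = rho
    cov[idx + 1, idx] = rho
    return sigma * rng.multivariate_normal(np.zeros(n), cov)

beta0, beta1 = 1.0, 3.0
n, reps = 200, 500
b1_hats = np.empty(reps)
for r in range(reps):
    x = np.arange(1, n + 1) / n * 5.0        # design constants
    eta = beta0 + beta1 * x + nsd_errors(n)  # NSD response errors
    xi = x + nsd_errors(n, sigma=0.1)        # NSD measurement errors
    b1_hats[r] = np.sum((xi - xi.mean()) * (eta - eta.mean())) / np.sum((xi - xi.mean()) ** 2)

print(b1_hats.mean(), b1_hats.std())
```

Across replications, the empirical mean of β̂_{n1} stays close to the true slope and its spread is small, which is in keeping with the mean consistency and approximate normality discussed above.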