
Open Mathematics

formerly Central European Journal of Mathematics

Editor-in-Chief: Gianazza, Ugo / Vespri, Vincenzo


Various limit theorems for ratios from the uniform distribution

Yu Miao (corresponding author), College of Mathematics and Information Science, Henan Normal University, Henan Province, 453007, China
Yan Sun, Science College, Beijing Information Science and Technology University, Beijing, 100192, China
Rujun Wang, College of Mathematics and Information Science, Henan Normal University, Henan Province, 453007, China
Manru Dong, College of Mathematics and Information Science, Henan Normal University, Henan Province, 453007, China
Published Online: 2016-06-11 | DOI: https://doi.org/10.1515/math-2016-0037

Abstract

In this paper, we consider the ratios of order statistics in samples from uniform distribution and establish strong and weak laws for these ratios.

MSC 2010: 60F15; 62G30

1 Introduction

If the random variables X1, …, Xn are arranged in order of magnitude and then written X(1) ≤ ··· ≤ X(n), or, in more explicit notation, Xn,1 ≤ ··· ≤ Xn,n, we call Xn,k the kth order statistic (k = 1, ···, n). Order statistics play a significant role in many settings, for example, robust location estimates, detection of outliers, strength of materials, reliability, quality control, selection of the best, and so on [1, 2]. If X1, ···, Xn have a common continuous distribution F, then with probability 1 the order statistics of the sample take distinct values. The exact distribution of the kth order statistic Xn,k is easily found but cumbersome to use:

$$P(X_{n,k}\le x)=\sum_{i=k}^{n}\binom{n}{i}[F(x)]^{i}[1-F(x)]^{n-i},\qquad -\infty<x<\infty.$$
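The tail-sum formula above (the event {Xn,k ≤ x} means that at least k of the n samples fall at or below x) is easy to check numerically. The following Python sketch, with illustrative parameters n = 5, k = 2, x = 0.4 for the uniform distribution on (0, 1), compares the formula with a Monte Carlo estimate:

```python
import random
from math import comb

def order_stat_cdf(n, k, p):
    # P(X_{n,k} <= x) when F(x) = p: at least k of the n samples fall at or below x.
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(k, n + 1))

# Monte Carlo check for the 2nd order statistic of n = 5 uniform(0, 1) samples.
random.seed(0)
n, k, x = 5, 2, 0.4
trials = 100_000
hits = sum(sorted(random.random() for _ in range(n))[k - 1] <= x
           for _ in range(trials))
exact = order_stat_cdf(n, k, x)  # F(x) = x for the uniform(0, 1) distribution
assert abs(hits / trials - exact) < 0.01
```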

In the theory of order statistics, the uniform distribution plays an important role. For instance, let us introduce independent identically distributed random variables η1, ···, ηn and ξ1, ···, ξn where the ηi are (0, 1)-uniformly distributed and the ξi have the common distribution F. Then the following two relations hold:

$$(\eta_{1},\cdots,\eta_{n})\stackrel{d}{=}(F(\xi_{1}),\cdots,F(\xi_{n}))$$

if F is continuous and

$$(\xi_{1},\cdots,\xi_{n})\stackrel{d}{=}(F^{-1}(\eta_{1}),\cdots,F^{-1}(\eta_{n}))$$

where F−1 is the quantile function. Let Un,1 ≤ ··· ≤ Un,n and Xn,1 ≤ ··· ≤ Xn,n be the order statistics of η1, ···, ηn and ξ1, ···, ξn, respectively; then one obtains

$$(U_{n,1},\cdots,U_{n,n})\stackrel{d}{=}(F(X_{n,1}),\cdots,F(X_{n,n}))$$

and

$$(X_{n,1},\cdots,X_{n,n})\stackrel{d}{=}(F^{-1}(U_{n,1}),\cdots,F^{-1}(U_{n,n})).$$

In general, spacings of uniform random variables cannot be independent. However, it was shown by Malmquist [3] that certain ratios of the order statistics Un,i of uniform random variables are independent, namely,

$$1-U_{n,1},\ \frac{1-U_{n,2}}{1-U_{n,1}},\ \cdots,\ \frac{1-U_{n,n}}{1-U_{n,n-1}}$$

are independent random variables and

$$\frac{1-U_{n,r}}{1-U_{n,r-1}}\stackrel{d}{=}U_{n-r+1,n-r+1},\qquad r=1,\cdots,n,$$

(with the convention that Un,0 = 0).
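Malmquist's identity can be illustrated with a quick Monte Carlo sketch (the parameters n = 4, r = 2 are chosen purely for illustration): the ratio (1 − Un,2)/(1 − Un,1) should be distributed as Un−r+1,n−r+1, the maximum of n − r + 1 = 3 uniforms, whose expectation is 3/4.

```python
import random

random.seed(1)
n, r, trials = 4, 2, 100_000
total = 0.0
for _ in range(trials):
    u = sorted(random.random() for _ in range(n))
    total += (1 - u[r - 1]) / (1 - u[r - 2])   # (1 - U_{n,2}) / (1 - U_{n,1})

# The maximum of 3 uniforms has density 3u^2 on (0, 1), hence mean 3/4.
mean = total / trials
assert abs(mean - 0.75) < 0.01
```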

In this paper, we consider the following ratios of order statistics from the uniform distribution:

$$R_{nij}=\frac{X_{n(j)}}{X_{n(i)}},\qquad 1\le i<j\le m_{n},$$

where Xn(k) denotes the kth order statistic of the sample (Xn1, ···, Xnmn) from the uniform distribution on [an, bn] for every n = 1, 2, ···. Some limit theorems for Rnij will be established, and, to the best of our knowledge, most of the results presented are new. Similar results for ratios of order statistics from exponential samples have been studied by Adler [4] and Miao et al. [5].

2 Main results

Assume that {Xni, i = 1, 2, ···, mn} are independent identically distributed random variables with uniform distribution on [an, bn] for every n = 1, 2, ···. Denote by {Xn(k), k = 1, 2, ···, mn} the order statistics of the random variables {Xni, i = 1, 2, ···, mn} and let

$$R_{nij}=\frac{X_{n(j)}}{X_{n(i)}},\qquad 1\le i<j\le m_{n},$$

denote the ratios of the order statistics {Xn(k), k = 1, 2, ···, mn}. Intuitively, if an = 0, the ratio Rnij takes values in [1, ∞), while if an > 0 it takes values in [1, bn/an]. Hence, we shall discuss the properties of the ratio Rnij in these two cases separately. Firstly, we establish the properties of Rn12.

2.1 Properties of Rn12

Theorem 2.1: Let 0 = an < bn. Then the density function of the ratio Rn12 is
$$f(r)=\frac{1}{r^{2}}I(r>1),$$
which is independent of n.

Proof. It is easy to see that the joint density of (Xn(1), Xn(2)) is
$$f(x_{1},x_{2})=\frac{m_{n}(m_{n}-1)}{b_{n}^{m_{n}}}(b_{n}-x_{2})^{m_{n}-2}I(0\le x_{1}\le x_{2}\le b_{n}).$$
Let w = x1 and r = x2/x1; then the Jacobian is w, and the joint probability density function of w and r is
$$f(w,r)=\frac{m_{n}(m_{n}-1)}{b_{n}^{m_{n}}}w(b_{n}-rw)^{m_{n}-2}I(w\ge 0,\ r>1,\ rw\le b_{n}).$$
Hence, substituting u = bn − rw,
$$f(r)=\frac{m_{n}(m_{n}-1)}{b_{n}^{m_{n}}}\int_{0}^{b_{n}/r}w(b_{n}-rw)^{m_{n}-2}\,dw=\frac{m_{n}(m_{n}-1)}{b_{n}^{m_{n}}r^{2}}\int_{0}^{b_{n}}(b_{n}-u)u^{m_{n}-2}\,du=\frac{1}{r^{2}}.$$
□
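Theorem 2.1 is easy to probe by simulation: the density 1/r² on (1, ∞) gives the survival function P(Rn12 > r) = 1/r, for any bn and mn. A short Monte Carlo sketch (the values bn = 5 and mn = 10 are illustrative assumptions) checks P(Rn12 > 2) = 1/2:

```python
import random

random.seed(2)
b, m, trials = 5.0, 10, 100_000   # illustrative b_n and m_n
count = 0
for _ in range(trials):
    xs = sorted(random.uniform(0, b) for _ in range(m))
    count += xs[1] / xs[0] > 2.0   # R_{n12} = X_{n(2)} / X_{n(1)}

# Density 1/r^2 on (1, inf) gives P(R_{n12} > 2) = 1/2, regardless of b and m.
assert abs(count / trials - 0.5) < 0.01
```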

Remark 2.2: Obviously, the expectation of Rn12 is infinite, while the β-order moment is finite for 0 < β < 1.

Theorem 2.3: Let 0 = an < bn. For any mn → ∞ and all α > −2,
$$\lim_{N\to\infty}\frac{1}{(\ln N)^{\alpha+2}}\sum_{n=1}^{N}\frac{(\ln n)^{\alpha}X_{n(2)}}{nX_{n(1)}}=\frac{1}{\alpha+2}\quad\text{almost surely.}$$

Remark 2.4: Theorem 2.3 extends the result of Adler [6, Theorem 3.1], which proved the same statement for samples from the uniform distribution on [0, p]. Since the proof is similar to Adler's, we omit it.

Theorem 2.5: Let 0 < an < bn. Then the density function of the ratio Rn12 is
$$f(r)=\frac{(b_{n}-a_{n}r)^{m_{n}-1}[(m_{n}-1)a_{n}r+b_{n}]}{(b_{n}-a_{n})^{m_{n}}r^{2}}I\left(1<r<\frac{b_{n}}{a_{n}}\right).$$

Proof. By a straightforward computation we have
$$f(x_{1},x_{2})=\frac{m_{n}(m_{n}-1)}{(b_{n}-a_{n})^{m_{n}}}(b_{n}-x_{2})^{m_{n}-2}I(a_{n}\le x_{1}\le x_{2}\le b_{n}).$$
Let w = x1 and r = x2/x1; then the Jacobian is w, and the joint probability density function of w and r is
$$f(w,r)=\frac{m_{n}(m_{n}-1)}{(b_{n}-a_{n})^{m_{n}}}w(b_{n}-rw)^{m_{n}-2}I(w\ge a_{n},\ r>1,\ rw\le b_{n}).$$
Hence, substituting u = bn − rw,
$$f(r)=\frac{m_{n}(m_{n}-1)}{(b_{n}-a_{n})^{m_{n}}}\int_{a_{n}}^{b_{n}/r}w(b_{n}-rw)^{m_{n}-2}\,dw=\frac{m_{n}(m_{n}-1)}{(b_{n}-a_{n})^{m_{n}}r^{2}}\int_{0}^{b_{n}-a_{n}r}(b_{n}-u)u^{m_{n}-2}\,du=\frac{(b_{n}-a_{n}r)^{m_{n}-1}[(m_{n}-1)a_{n}r+b_{n}]}{(b_{n}-a_{n})^{m_{n}}r^{2}},$$
which completes the proof of the theorem. □

Theorem 2.6: Let 0 < an < bn. The expectation of the ratio Rn12 is
$$E(R_{n12})=1+\frac{1}{(b_{n}-a_{n})^{m_{n}}}\int_{1}^{b_{n}/a_{n}}\frac{(b_{n}-a_{n}r)^{m_{n}}}{r}\,dr=:1+\Delta_{n},$$
where
$$\frac{b_{n}-a_{n}}{b_{n}(m_{n}+1)}\le\Delta_{n}\le\frac{b_{n}-a_{n}}{a_{n}(m_{n}+1)}.$$
In particular, if, as n → ∞,
$$\frac{b_{n}-a_{n}}{a_{n}m_{n}}\to 0,$$
then we have E(Rn12) → 1.

Proof. By Theorem 2.5 we have
$$E(R_{n12})=\frac{1}{(b_{n}-a_{n})^{m_{n}}}\int_{1}^{b_{n}/a_{n}}\frac{(b_{n}-a_{n}r)^{m_{n}-1}[(m_{n}-1)a_{n}r+b_{n}]}{r}\,dr=\frac{1}{(b_{n}-a_{n})^{m_{n}}}\int_{1}^{b_{n}/a_{n}}\left[a_{n}m_{n}(b_{n}-a_{n}r)^{m_{n}-1}+\frac{(b_{n}-a_{n}r)^{m_{n}}}{r}\right]dr=1+\frac{1}{(b_{n}-a_{n})^{m_{n}}}\int_{1}^{b_{n}/a_{n}}\frac{(b_{n}-a_{n}r)^{m_{n}}}{r}\,dr,$$
which yields the theorem. □
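The two-sided bound on Δn in Theorem 2.6 can be verified deterministically by numerical integration; the sketch below uses a midpoint rule and the illustrative values an = 1, bn = 2, mn = 100 (assumptions, not values from the paper). The integrand is normalized as ((bn − an r)/(bn − an))^mn / r to avoid overflow for large mn.

```python
# Midpoint-rule evaluation of Delta_n = (b-a)^(-m) * int_1^{b/a} (b - a*r)^m / r dr.
a, b, m = 1.0, 2.0, 100     # illustrative values with 0 < a_n < b_n
steps = 100_000
h = (b / a - 1) / steps
delta = h * sum(((b - a * (1 + (i + 0.5) * h)) / (b - a)) ** m / (1 + (i + 0.5) * h)
                for i in range(steps))
lower = (b - a) / (b * (m + 1))
upper = (b - a) / (a * (m + 1))
assert lower <= delta <= upper
```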

Theorem 2.7: Let 0 < an < bn. The second moment of the ratio Rn12 is
$$E(R_{n12}^{2})=1+\frac{2(b_{n}-a_{n})}{a_{n}(m_{n}+1)}\qquad(1)$$
and the variance of Rn12 has the following estimate:
$$\mathrm{Var}(R_{n12})\le\frac{(b_{n}-a_{n})^{2}}{(m_{n}+1)b_{n}}\left(\frac{2}{a_{n}}-\frac{1}{b_{n}(m_{n}+1)}\right).$$

Proof. From Theorem 2.5 and Theorem 2.6, it is easy to obtain the desired results. □

Remark 2.8: From Theorem 2.7, if $\frac{b_{n}-a_{n}}{a_{n}m_{n}}\to 0$, then Var(Rn12) → 0. Based on the above theorems for the case 0 < an < bn, the following central limit theorem for Rn12 holds.

Theorem 2.9: Let 0 < an < bn. Then we have
$$\frac{1}{\sqrt{N}}\sum_{n=1}^{N}\frac{R_{n12}-ER_{n12}}{\sqrt{\mathrm{Var}(R_{n12})}}\stackrel{d}{\to}N(0,1).$$

Proof. The claim follows by verifying Lyapunov's condition. □

2.2 Properties of Rn23

In this subsection, we discuss the properties of the ratio of the second and third order statistics from an independent identically distributed sample of uniform distribution.

Theorem 2.10: Let 0 = an < bn. Then the density function of the ratio Rn23 is
$$f(r)=\frac{2}{r^{3}}I(r>1),$$
which is independent of n.

Proof. It is not difficult to obtain
$$f(x_{2},x_{3})=\frac{m_{n}!}{(m_{n}-3)!\,b_{n}^{m_{n}}}x_{2}(b_{n}-x_{3})^{m_{n}-3}I(0\le x_{2}\le x_{3}\le b_{n}).$$
Let w = x2 and r = x3/x2; then the Jacobian is w, and the joint probability density function of w and r is
$$f(w,r)=\frac{m_{n}!}{(m_{n}-3)!\,b_{n}^{m_{n}}}w^{2}(b_{n}-rw)^{m_{n}-3}I(w\ge 0,\ r>1,\ rw\le b_{n}).$$
Hence, substituting u = bn − rw,
$$f(r)=\frac{m_{n}!}{(m_{n}-3)!\,b_{n}^{m_{n}}}\int_{0}^{b_{n}/r}w^{2}(b_{n}-rw)^{m_{n}-3}\,dw=\frac{m_{n}!}{(m_{n}-3)!\,b_{n}^{m_{n}}r^{3}}\int_{0}^{b_{n}}(b_{n}-u)^{2}u^{m_{n}-3}\,du=\frac{2}{r^{3}},$$
which completes the proof. □

Remark 2.11: It is not difficult to check that for any 0 < δ < 2 the δ-order moment of Rn23 exists, while the second moment is infinite. In particular, E(Rn23) = 2.
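As with Rn12, the density of Theorem 2.10 can be probed by simulation: 2/r³ on (1, ∞) gives the survival function P(Rn23 > r) = 1/r², independent of bn and mn. A short sketch (bn = 1, mn = 8 are illustrative assumptions) checks P(Rn23 > 2) = 1/4:

```python
import random

random.seed(3)
b, m, trials = 1.0, 8, 100_000   # illustrative b_n and m_n >= 3
count = 0
for _ in range(trials):
    xs = sorted(random.uniform(0, b) for _ in range(m))
    count += xs[2] / xs[1] > 2.0   # R_{n23} = X_{n(3)} / X_{n(2)}

# Density 2/r^3 gives P(R_{n23} > r) = 1/r^2, so P(R_{n23} > 2) = 1/4.
assert abs(count / trials - 0.25) < 0.01
```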

Theorem 2.12: Let 0 = an < bn. Then we have
$$\frac{1}{\sqrt{N\ln N}}\sum_{n=1}^{N}(R_{n23}-ER_{n23})\stackrel{d}{\to}N(0,1).$$

Proof. Let ξ := Rn23 − ERn23 and define L(x) := E(ξ²I{|ξ| ≤ x}) and
$$c_{N}=1\vee\sup\{x\ge 0:\ NL(x)\ge x^{2}\},\qquad N\in\mathbb{N}.$$
Since L(x) grows like 2 ln x, it is easy to see that $c_{N}\sim\sqrt{N\ln N}$, so by Theorem 4.17 in [7] the desired result can be obtained. □

Theorem 2.13: Let 0 < an < bn. Then the density function of Rn23 is
$$f(r)=\frac{(b_{n}-a_{n}r)^{m_{n}-1}[(m_{n}-2)a_{n}r+2b_{n}]}{(b_{n}-a_{n})^{m_{n}}r^{3}}I\left(1<r<\frac{b_{n}}{a_{n}}\right).$$

Proof. It is easy to get
$$f(x_{2},x_{3})=\frac{m_{n}!}{(m_{n}-3)!(b_{n}-a_{n})^{m_{n}}}(x_{2}-a_{n})(b_{n}-x_{3})^{m_{n}-3}I(a_{n}\le x_{2}\le x_{3}\le b_{n}).$$
Let w = x2 and r = x3/x2; then the Jacobian is w, and the joint probability density function of w and r is
$$f(w,r)=\frac{m_{n}!}{(m_{n}-3)!(b_{n}-a_{n})^{m_{n}}}w(w-a_{n})(b_{n}-rw)^{m_{n}-3}I(w\ge a_{n},\ r>1,\ rw\le b_{n}).$$
Hence, substituting u = bn − rw,
$$f(r)=\frac{m_{n}!}{(m_{n}-3)!(b_{n}-a_{n})^{m_{n}}}\int_{a_{n}}^{b_{n}/r}w(w-a_{n})(b_{n}-rw)^{m_{n}-3}\,dw=\frac{m_{n}!}{(m_{n}-3)!(b_{n}-a_{n})^{m_{n}}r^{3}}\int_{0}^{b_{n}-a_{n}r}\left[u^{m_{n}-1}-(2b_{n}-a_{n}r)u^{m_{n}-2}+(b_{n}-a_{n}r)b_{n}u^{m_{n}-3}\right]du=\frac{(b_{n}-a_{n}r)^{m_{n}-1}[(m_{n}-2)a_{n}r+2b_{n}]}{(b_{n}-a_{n})^{m_{n}}r^{3}},$$
which completes the proof. □

From Theorem 2.13, we know that for all δ > 0 the δ-order moments of Rn23 exist. However, some moments cannot be expressed explicitly.

Theorem 2.14: Let 0 < an < bn. The expectation of Rn23 is
$$E(R_{n23})=1+\frac{b_{n}-a_{n}}{a_{n}}\int_{0}^{1}\frac{(1-t)^{m_{n}}}{\left(\frac{b_{n}-a_{n}}{a_{n}}t+1\right)^{2}}\,dt.$$
In particular, if, as n → ∞,
$$\frac{b_{n}-a_{n}}{a_{n}m_{n}}\to 0,$$
then we have E(Rn23) → 1.

Proof. By using Theorem 2.13 we obtain
$$E(R_{n23})=\frac{1}{(b_{n}-a_{n})^{m_{n}}}\int_{1}^{b_{n}/a_{n}}\frac{(b_{n}-a_{n}r)^{m_{n}-1}[(m_{n}-2)a_{n}r+2b_{n}]}{r^{2}}\,dr=\frac{1}{(b_{n}-a_{n})^{m_{n}}}\int_{1}^{b_{n}/a_{n}}\left[\frac{a_{n}m_{n}(b_{n}-a_{n}r)^{m_{n}-1}}{r}+\frac{2(b_{n}-a_{n}r)^{m_{n}}}{r^{2}}\right]dr=\frac{a_{n}}{(b_{n}-a_{n})^{m_{n}}}\int_{a_{n}}^{b_{n}}\left[\frac{m_{n}(b_{n}-u)^{m_{n}-1}}{u}+\frac{2(b_{n}-u)^{m_{n}}}{u^{2}}\right]du=\int_{0}^{1}\frac{m_{n}(1-t)^{m_{n}-1}}{\frac{b_{n}-a_{n}}{a_{n}}t+1}\,dt+\int_{0}^{1}\frac{2(1-t)^{m_{n}}}{\frac{b_{n}-a_{n}}{a_{n}}\left(t+\frac{a_{n}}{b_{n}-a_{n}}\right)^{2}}\,dt=2-\int_{0}^{1}\frac{m_{n}(1-t)^{m_{n}-1}}{\frac{b_{n}-a_{n}}{a_{n}}t+1}\,dt=1+\frac{b_{n}-a_{n}}{a_{n}}\int_{0}^{1}\frac{(1-t)^{m_{n}}}{\left(\frac{b_{n}-a_{n}}{a_{n}}t+1\right)^{2}}\,dt,$$
where we use the substitution u := anr in the third equality, the substitution $t:=\frac{u-a_{n}}{b_{n}-a_{n}}$ in the fourth equality, and integration by parts in the last two equalities. □

Theorem 2.15: Let 0 < an < bn. The second moment of the ratio Rn23 is
$$E(R_{n23}^{2})=1+2\int_{0}^{1}\frac{(1-t)^{m_{n}}}{t+a_{n}/(b_{n}-a_{n})}\,dt.$$
In particular, if, as n → ∞,
$$\frac{b_{n}-a_{n}}{a_{n}m_{n}}\to 0,$$
then we have E(Rn23²) → 1.

Proof. By Theorem 2.13 we have
$$E(R_{n23}^{2})=\frac{1}{(b_{n}-a_{n})^{m_{n}}}\int_{1}^{b_{n}/a_{n}}\frac{(b_{n}-a_{n}r)^{m_{n}-1}[(m_{n}-2)a_{n}r+2b_{n}]}{r}\,dr=\frac{1}{(b_{n}-a_{n})^{m_{n}}}\int_{1}^{b_{n}/a_{n}}\left[a_{n}m_{n}(b_{n}-a_{n}r)^{m_{n}-1}+\frac{2(b_{n}-a_{n}r)^{m_{n}}}{r}\right]dr=1+\frac{2}{(b_{n}-a_{n})^{m_{n}}}\int_{1}^{b_{n}/a_{n}}\frac{(b_{n}-a_{n}r)^{m_{n}}}{r}\,dr=1+\frac{2}{(b_{n}-a_{n})^{m_{n}}}\int_{a_{n}}^{b_{n}}\frac{(b_{n}-u)^{m_{n}}}{u}\,du=1+2\int_{0}^{1}\frac{(1-t)^{m_{n}}}{t+a_{n}/(b_{n}-a_{n})}\,dt\qquad\left(t:=\frac{u-a_{n}}{b_{n}-a_{n}}\right).$$
□
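The closed form of Theorem 2.15 can be cross-checked against simulation; the sketch below (with illustrative values an = 1, bn = 2, mn = 50) compares a Monte Carlo estimate of E(Rn23²) with a midpoint-rule evaluation of the integral:

```python
import random

random.seed(4)
a, b, m, trials = 1.0, 2.0, 50, 40_000   # illustrative parameters
acc = 0.0
for _ in range(trials):
    xs = sorted(random.uniform(a, b) for _ in range(m))
    acc += (xs[2] / xs[1]) ** 2            # R_{n23}^2
mc = acc / trials

# Closed form from Theorem 2.15: E(R^2) = 1 + 2 * int_0^1 (1-t)^m / (t + a/(b-a)) dt.
steps = 50_000
h = 1.0 / steps
closed = 1.0 + 2.0 * h * sum((1 - (i + 0.5) * h) ** m / ((i + 0.5) * h + a / (b - a))
                             for i in range(steps))
assert abs(mc - closed) < 0.01
```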

Remark 2.16: From Theorem 2.14 and Theorem 2.15, we know that if $\frac{b_{n}-a_{n}}{a_{n}m_{n}}\to 0$, then Var(Rn23) → 0.

2.3 Properties of Rn1j

In this subsection, we establish the properties of the ratio of the first and jth order statistics from an independent identically distributed sample of the uniform distribution with parameters an and bn, where the sample size is fixed at m.

Theorem 2.17: Let 0 = an < bn. Then the density function of Rn1j is
$$f(r)=\frac{m!(r-1)^{j-2}}{(j-2)!(m-j)!\,r^{j}}\sum_{k=0}^{m-j}\binom{m-j}{k}\frac{(-1)^{k}}{j+k}\,I(r>1).$$

Proof. It is easy to see
$$f(x_{1},x_{j})=\frac{m!}{(j-2)!(m-j)!\,b_{n}^{m}}(x_{j}-x_{1})^{j-2}(b_{n}-x_{j})^{m-j}I(0\le x_{1}\le x_{j}\le b_{n}).$$
Let w = x1 and r = xj/x1; then the Jacobian is w, and the joint probability density function of w and r is
$$f(w,r)=\frac{m!}{(j-2)!(m-j)!\,b_{n}^{m}}w^{j-1}(r-1)^{j-2}(b_{n}-rw)^{m-j}I(w\ge 0,\ r>1,\ rw\le b_{n}).$$
Hence, expanding (bn − rw)^{m−j} by the binomial theorem,
$$f(r)=\frac{m!(r-1)^{j-2}}{(j-2)!(m-j)!\,b_{n}^{m}}\int_{0}^{b_{n}/r}w^{j-1}(b_{n}-rw)^{m-j}\,dw=\frac{m!(r-1)^{j-2}}{(j-2)!(m-j)!\,b_{n}^{m}}\sum_{k=0}^{m-j}\binom{m-j}{k}b_{n}^{m-j-k}(-1)^{k}r^{k}\int_{0}^{b_{n}/r}w^{j-1+k}\,dw=\frac{m!(r-1)^{j-2}}{(j-2)!(m-j)!\,r^{j}}\sum_{k=0}^{m-j}\binom{m-j}{k}\frac{(-1)^{k}}{j+k},$$
which completes this proof. □
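As a consistency check (our own observation, not a claim made in the paper), the alternating sum in Theorem 2.17 is the Beta integral $\int_0^1 u^{j-1}(1-u)^{m-j}\,du=\frac{(j-1)!(m-j)!}{m!}$, so the constant in front of $(r-1)^{j-2}/r^{j}$ collapses to j − 1; for j = 2 this recovers the density 1/r² of Theorem 2.1. The identity can be verified exactly with rational arithmetic:

```python
from fractions import Fraction
from math import comb, factorial

def alt_sum(m, j):
    # Exact value of sum_{k=0}^{m-j} C(m-j, k) * (-1)^k / (j + k).
    return sum(Fraction((-1) ** k * comb(m - j, k), j + k)
               for k in range(m - j + 1))

# Beta-integral identity: the sum equals (j-1)!(m-j)!/m! for all 2 <= j <= m.
for m in range(3, 12):
    for j in range(2, m + 1):
        assert alt_sum(m, j) == Fraction(factorial(j - 1) * factorial(m - j),
                                         factorial(m))
```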

Theorem 2.18: Let 0 = an < bn. For all α > −2,
$$\lim_{N\to\infty}\frac{1}{(\ln N)^{\alpha+2}}\sum_{n=1}^{N}\frac{(\ln n)^{\alpha}X_{n(j)}}{nX_{n(1)}}=\frac{m!}{(j-2)!(m-j)!(\alpha+2)}\sum_{k=0}^{m-j}\binom{m-j}{k}\frac{(-1)^{k}}{j+k}\quad\text{almost surely.}$$

Proof. Let zn = (ln n)^α/n, dN = (ln N)^{α+2} and cn = dn/zn = n(ln n)². Denote Rn1j = Xn(j)/Xn(1); then by Theorem 2.17 the density function of Rn1j is
$$f(r)=\frac{m!(r-1)^{j-2}}{(j-2)!(m-j)!\,r^{j}}\sum_{k=0}^{m-j}\binom{m-j}{k}\frac{(-1)^{k}}{j+k}\,I(r>1).$$
Next we use the partition
$$\frac{1}{d_{N}}\sum_{n=1}^{N}z_{n}R_{n1j}=\frac{1}{d_{N}}\sum_{n=1}^{N}z_{n}\left[R_{n1j}I(1\le R_{n1j}\le c_{n})-ER_{n1j}I(1\le R_{n1j}\le c_{n})\right]+\frac{1}{d_{N}}\sum_{n=1}^{N}z_{n}R_{n1j}I(R_{n1j}>c_{n})+\frac{1}{d_{N}}\sum_{n=1}^{N}z_{n}ER_{n1j}I(1\le R_{n1j}\le c_{n}).$$
Combining the Khintchine–Kolmogorov convergence theorem (see Chow and Teicher [8, Theorem 1, p. 113]), Kronecker's lemma (see Chow and Teicher [8, Lemma 2, p. 114]) and the estimate
$$\sum_{n=1}^{\infty}\frac{1}{c_{n}^{2}}ER_{n1j}^{2}I(1\le R_{n1j}\le c_{n})=\sum_{n=1}^{\infty}\frac{1}{c_{n}^{2}}\,\frac{m!}{(j-2)!(m-j)!}\left[\sum_{k=0}^{m-j}\binom{m-j}{k}\frac{(-1)^{k}}{j+k}\right]\int_{1}^{c_{n}}\left(\frac{r-1}{r}\right)^{j-2}dr\le C\sum_{n=1}^{\infty}\frac{1}{c_{n}^{2}}\int_{1}^{c_{n}}dr\le C\sum_{n=1}^{\infty}\frac{1}{c_{n}}=C\sum_{n=1}^{\infty}\frac{1}{n(\ln n)^{2}}<\infty,$$
we find that the first term vanishes almost surely. By the Borel–Cantelli lemma and the estimate
$$\sum_{n=1}^{\infty}P\{R_{n1j}>c_{n}\}=\sum_{n=1}^{\infty}\frac{m!}{(j-2)!(m-j)!}\left[\sum_{k=0}^{m-j}\binom{m-j}{k}\frac{(-1)^{k}}{j+k}\right]\int_{c_{n}}^{\infty}\frac{(r-1)^{j-2}}{r^{j}}\,dr=C\sum_{n=1}^{\infty}\int_{c_{n}}^{\infty}\left(\frac{r-1}{r}\right)^{j}\frac{dr}{(r-1)^{2}}\le C\sum_{n=1}^{\infty}\int_{c_{n}}^{\infty}\frac{dr}{(r-1)^{2}}=C\sum_{n=1}^{\infty}\frac{1}{c_{n}-1}<\infty,$$
the second term vanishes. For the third term in the partition, we have
$$ER_{n1j}I(1\le R_{n1j}\le c_{n})=\frac{m!}{(j-2)!(m-j)!}\left[\sum_{k=0}^{m-j}\binom{m-j}{k}\frac{(-1)^{k}}{j+k}\right]\int_{1}^{c_{n}}\frac{(r-1)^{j-2}}{r^{j-1}}\,dr=\frac{m!}{(j-2)!(m-j)!}\left[\sum_{k=0}^{m-j}\binom{m-j}{k}\frac{(-1)^{k}}{j+k}\right]\sum_{t=0}^{j-2}\binom{j-2}{t}(-1)^{j-2-t}\int_{1}^{c_{n}}r^{t+1-j}\,dr=\frac{m!}{(j-2)!(m-j)!}\left[\sum_{k=0}^{m-j}\binom{m-j}{k}\frac{(-1)^{k}}{j+k}\right]\left[\sum_{t=0}^{j-3}\binom{j-2}{t}(-1)^{j-2-t}\frac{c_{n}^{t+2-j}-1}{t+2-j}+\ln c_{n}\right]\sim\frac{m!}{(j-2)!(m-j)!}\left[\sum_{k=0}^{m-j}\binom{m-j}{k}\frac{(-1)^{k}}{j+k}\right]\ln c_{n}\sim\frac{m!}{(j-2)!(m-j)!}\left[\sum_{k=0}^{m-j}\binom{m-j}{k}\frac{(-1)^{k}}{j+k}\right]\ln n,$$
and therefore
$$\frac{1}{d_{N}}\sum_{n=1}^{N}z_{n}ER_{n1j}I(1\le R_{n1j}\le c_{n})\sim\frac{m!}{(j-2)!(m-j)!}\left[\sum_{k=0}^{m-j}\binom{m-j}{k}\frac{(-1)^{k}}{j+k}\right]\frac{1}{(\ln N)^{\alpha+2}}\sum_{n=1}^{N}\frac{(\ln n)^{\alpha+1}}{n}\to\frac{m!}{(j-2)!(m-j)!(\alpha+2)}\sum_{k=0}^{m-j}\binom{m-j}{k}\frac{(-1)^{k}}{j+k},$$
so the theorem holds. □

Theorem 2.19: Let 0 < an < bn. Then the density function of Rn1j is
$$f(r)=C_{n}(r)\sum_{k=0}^{m-j}\binom{m-j}{k}\frac{(-1)^{k}b_{n}^{m-j-k}}{k+j}\left(b_{n}^{k+j}-a_{n}^{k+j}r^{k+j}\right)I\left(1<r<\frac{b_{n}}{a_{n}}\right),$$
where
$$C_{n}(r):=C_{m,a_{n},b_{n},j}(r):=\frac{m!(r-1)^{j-2}}{(j-2)!(m-j)!(b_{n}-a_{n})^{m}r^{j}}.$$

Proof. It is not difficult to obtain
$$f(x_{1},x_{j})=\frac{m!}{(j-2)!(m-j)!(b_{n}-a_{n})^{m}}(x_{j}-x_{1})^{j-2}(b_{n}-x_{j})^{m-j}I(a_{n}\le x_{1}\le x_{j}\le b_{n}).$$
Let w = x1 and r = xj/x1; then the Jacobian is w, and the joint probability density function of w and r is
$$f(w,r)=\frac{m!}{(j-2)!(m-j)!(b_{n}-a_{n})^{m}}w^{j-1}(r-1)^{j-2}(b_{n}-rw)^{m-j}I(w\ge a_{n},\ r>1,\ rw\le b_{n}).$$
Hence
$$f(r)=\frac{m!(r-1)^{j-2}}{(j-2)!(m-j)!(b_{n}-a_{n})^{m}}\int_{a_{n}}^{b_{n}/r}w^{j-1}(b_{n}-rw)^{m-j}\,dw=\frac{m!(r-1)^{j-2}}{(j-2)!(m-j)!(b_{n}-a_{n})^{m}}\sum_{k=0}^{m-j}\binom{m-j}{k}b_{n}^{m-j-k}(-r)^{k}\int_{a_{n}}^{b_{n}/r}w^{k+j-1}\,dw=\frac{m!(r-1)^{j-2}}{(j-2)!(m-j)!(b_{n}-a_{n})^{m}r^{j}}\sum_{k=0}^{m-j}\binom{m-j}{k}\frac{(-1)^{k}b_{n}^{m-j-k}}{k+j}\left(b_{n}^{k+j}-a_{n}^{k+j}r^{k+j}\right),$$
which completes this proof. □

Theorem 2.20: Let 0 < an < bn. The expectation of Rn1j is
$$E(R_{n1j})=\frac{m!}{(j-2)!(m-j)!(b_{n}-a_{n})^{m}}\sum_{k=0}^{m-j}\binom{m-j}{k}\frac{(-1)^{k}b_{n}^{m-j-k}}{k+j}\left[b_{n}^{k+j}\ln\frac{b_{n}}{a_{n}}+\frac{a_{n}^{k+j}-b_{n}^{k+j}}{k+j}+\sum_{t=0}^{j-3}\binom{j-2}{t}(-1)^{j-2-t}\left(\frac{a_{n}^{j-t-2}b_{n}^{t+k+2}-b_{n}^{k+j}}{t-j+2}+\frac{a_{n}^{k+j}-a_{n}^{j-t-2}b_{n}^{t+k+2}}{t+k+2}\right)\right].$$

Proof. By Theorem 2.19 we have
$$E(R_{n1j})=\frac{m!}{(j-2)!(m-j)!(b_{n}-a_{n})^{m}}\sum_{k=0}^{m-j}\binom{m-j}{k}\frac{(-1)^{k}b_{n}^{m-j-k}}{k+j}\int_{1}^{b_{n}/a_{n}}(r-1)^{j-2}\left(\frac{b_{n}^{k+j}}{r^{j-1}}-a_{n}^{k+j}r^{k+1}\right)dr=\frac{m!}{(j-2)!(m-j)!(b_{n}-a_{n})^{m}}\sum_{k=0}^{m-j}\binom{m-j}{k}\frac{(-1)^{k}b_{n}^{m-j-k}}{k+j}\sum_{t=0}^{j-2}\binom{j-2}{t}(-1)^{j-2-t}\int_{1}^{b_{n}/a_{n}}\left(b_{n}^{k+j}r^{t-j+1}-a_{n}^{k+j}r^{t+k+1}\right)dr=\frac{m!}{(j-2)!(m-j)!(b_{n}-a_{n})^{m}}\sum_{k=0}^{m-j}\binom{m-j}{k}\frac{(-1)^{k}b_{n}^{m-j-k}}{k+j}\left[\int_{1}^{b_{n}/a_{n}}\left(\frac{b_{n}^{k+j}}{r}-a_{n}^{k+j}r^{j+k-1}\right)dr+\sum_{t=0}^{j-3}\binom{j-2}{t}(-1)^{j-2-t}\int_{1}^{b_{n}/a_{n}}\left(b_{n}^{k+j}r^{t-j+1}-a_{n}^{k+j}r^{t+k+1}\right)dr\right]=\frac{m!}{(j-2)!(m-j)!(b_{n}-a_{n})^{m}}\sum_{k=0}^{m-j}\binom{m-j}{k}\frac{(-1)^{k}b_{n}^{m-j-k}}{k+j}\left[b_{n}^{k+j}\ln\frac{b_{n}}{a_{n}}+\frac{a_{n}^{k+j}-b_{n}^{k+j}}{k+j}+\sum_{t=0}^{j-3}\binom{j-2}{t}(-1)^{j-2-t}\left(\frac{a_{n}^{j-t-2}b_{n}^{t+k+2}-b_{n}^{k+j}}{t-j+2}+\frac{a_{n}^{k+j}-a_{n}^{j-t-2}b_{n}^{t+k+2}}{t+k+2}\right)\right],$$
so the theorem holds. □
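The double sum of Theorem 2.20 is intricate enough that a numerical cross-check is reassuring. The sketch below implements the sum directly (the t = j − 2 term is the logarithmic one, handled separately as in the proof) and compares it with a Monte Carlo estimate; the values m = 4, j = 3, an = 1, bn = 2 are illustrative assumptions.

```python
import random
from math import comb, factorial, log

def expected_R1j(m, j, a, b):
    # E(X_(j)/X_(1)) for m iid uniform[a, b] samples, following Theorem 2.20.
    c0 = factorial(m) / (factorial(j - 2) * factorial(m - j) * (b - a) ** m)
    total = 0.0
    for k in range(m - j + 1):
        inner = 0.0
        for t in range(j - 1):
            if t == j - 2:   # logarithmic term: integral of b^(k+j)/r
                first = b ** (k + j) * log(b / a)
            else:
                first = b ** (k + j) * ((b / a) ** (t + 2 - j) - 1) / (t + 2 - j)
            second = a ** (k + j) * ((b / a) ** (t + k + 2) - 1) / (t + k + 2)
            inner += comb(j - 2, t) * (-1) ** (j - 2 - t) * (first - second)
        total += comb(m - j, k) * (-1) ** k * b ** (m - j - k) / (k + j) * inner
    return c0 * total

random.seed(5)
m, j, a, b, trials = 4, 3, 1.0, 2.0, 100_000
acc = 0.0
for _ in range(trials):
    xs = sorted(random.uniform(a, b) for _ in range(m))
    acc += xs[j - 1] / xs[0]
assert abs(acc / trials - expected_R1j(m, j, a, b)) < 0.01
```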

Acknowledgement

This work is supported by IRTSTHN (14IRTSTHN023), NSFC (11471104, 71501016), Science and Technology Project of Beijing Municipal Education Commission (71E1610975).

References

[1] Arnold B. C., Balakrishnan N., Nagaraja H. N., A First Course in Order Statistics, SIAM, Philadelphia, PA, 2008.

[2] Reiss R. D., Approximate Distributions of Order Statistics, Springer-Verlag, New York, 1989.

[3] Malmquist S., On a property of order statistics from a rectangular distribution, Skand. Aktuarietidskr., 1950, 33, 214-222.

[4] Adler A., Strong laws for ratios of order statistics from exponentials, Bull. Inst. Math. Acad. Sin. (N.S.), 2015, 10(1), 101-111.

[5] Miao Y., Wang R. J., Adler A., Limit theorems for order statistics from exponentials, Statist. Probab. Lett., 2016, 110, 51-57.

[6] Adler A., Laws of large numbers for ratios of uniform random variables, Open Math., 2015, 13, 571-576.

[7] Kallenberg O., Foundations of Modern Probability, Springer-Verlag, New York, 1997.

[8] Chow Y. S., Teicher H., Probability Theory: Independence, Interchangeability, Martingales, 3rd ed., Springer-Verlag, New York, 1997.

Accepted: 2016-05-27

Published Online: 2016-06-11

Published in Print: 2016-01-01

Citation Information: Open Mathematics, ISSN (Online) 2391-5455.