RECORD DETAIL


UPA Perpustakaan Universitas Jember

Uniform integrability of the OLS estimators, and the convergence of their moments

The problem of convergence of the moments of a sequence of random variables to the moments of its asymptotic distribution is important in many applications. These include the determination of the optimal training sample size in the cross-validation estimation of the generalization error of computer algorithms, and the construction of graphical methods for studying dependence patterns between two biomarkers. In this paper, we prove the uniform integrability of the ordinary least squares estimators of a linear regression model, under suitable assumptions on the design matrix and the moments of the errors. Further, we prove the convergence of the moments of the estimators to the corresponding moments of their asymptotic distribution, and study the rate of the moment convergence. The canonical central limit theorem corresponds to the simplest linear regression model. We investigate the rate of the moment convergence in the canonical central limit theorem, proving a sharp improvement of von Bahr's theorem (Ann Math Stat 36:808–818, 1965).
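
To make the abstract's objects concrete, the following is a minimal sketch of the standard setting under the usual regularity assumptions; the notation is generic and not taken from the paper itself (the limit matrix Q, the limit variable Z, and the exponent p are illustrative assumptions).

\[
Y = X\beta + \varepsilon, \qquad
\hat{\beta}_n = (X^\top X)^{-1} X^\top Y, \qquad
\sqrt{n}\,(\hat{\beta}_n - \beta) \xrightarrow{d} Z \sim \mathcal{N}\bigl(0,\ \sigma^2 Q^{-1}\bigr), \qquad
Q = \lim_{n \to \infty} \tfrac{1}{n} X^\top X .
\]

In the intercept-only design (X a column of ones), \(\hat{\beta}_n\) is the sample mean \(\bar{Y}_n\), and the display reduces to the canonical central limit theorem \(\sqrt{n}\,(\bar{Y}_n - \mu) \xrightarrow{d} \mathcal{N}(0, \sigma^2)\); this is the sense in which the canonical CLT corresponds to the simplest linear regression model. Convergence of moments, \(\mathbb{E}\lvert \sqrt{n}(\hat{\beta}_n - \beta)\rvert^p \to \mathbb{E}\lvert Z\rvert^p\), is precisely what uniform integrability of the p-th powers adds on top of convergence in distribution.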

Availability
EB00000004430K: Available
Detail Information

Series Title: -
Call Number: -
Publisher: -
Collation: -
Language: -
ISBN/ISSN: -
Classification: NONE

Content Type: E-Journal
Media Type: -
Carrier Type: -
Edition: -
Specific Detail Info: -
Statement of Responsibility: -