dc.description.abstract |
The ordinary least squares method is considered one of the most important ways of estimating the parameters of the general linear model because of its ease and simplicity and the soundness of the results it yields when the specific assumptions of the general linear model about the error term and the explanatory variables, which are supposed to be orthogonal, are satisfied.
Yet if these assumptions are not satisfied, the ordinary least squares method gives undesirable results and the problem of inaccurate estimation appears. One such problem is autocorrelation of the errors, which occurs when the value of the error term in any particular period is correlated with one or more of its own preceding values [ E(u_t u_{t-s}) ≠ 0, s ≠ 0 ]. Multicollinearity is another significant problem; it occurs when the explanatory variables are highly intercorrelated.
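As a minimal illustration of the two problems (a sketch in Python; the parameter values, sample size, and variable names are hypothetical, chosen only for demonstration), the following generates a response whose errors follow an AR(1) scheme and whose explanatory variables are nearly collinear:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100

    # AR(1) errors: u_t = rho*u_{t-1} + e_t, so E(u_t u_{t-s}) != 0 for s != 0
    rho = 0.8
    e = rng.normal(size=n)
    u = np.zeros(n)
    u[0] = e[0]
    for t in range(1, n):
        u[t] = rho * u[t - 1] + e[t]

    # Nearly collinear explanatory variables: x2 is x1 plus small noise
    x1 = rng.normal(size=n)
    x2 = x1 + 0.05 * rng.normal(size=n)
    X = np.column_stack([np.ones(n), x1, x2])

    beta = np.array([1.0, 2.0, 3.0])  # hypothetical true coefficients
    y = X @ beta + u

    # A large condition number of X'X signals multicollinearity; the lag-1
    # sample correlation of u reveals the autoregressive error structure
    print(np.linalg.cond(X.T @ X))
    print(np.corrcoef(u[1:], u[:-1])[0, 1])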
The objective of this study is to discuss the case of a model that suffers from both problems: autocorrelation (AR(1) and AR(2)) and multicollinearity. We used a simulation technique to create data containing both problems at the same time, and we applied several statistical methods to deal with them: the two-stage least squares procedure (TS) to treat the autocorrelation problem, and the biased estimation methods, ridge regression (RR), principal components (PC), and latent root regression (LR), to treat the multicollinearity of the data that had first been corrected for autocorrelation. Moreover, we used evaluation criteria as the basis for the process of evaluation and comparison.
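The abstract does not spell out the exact algorithms, so the following sketch is only one assumption of how such a two-step treatment might look: a Cochrane-Orcutt-style quasi-differencing to remove AR(1) autocorrelation, followed by a ridge (biased) estimator with a hypothetical biasing constant k, applied to the X and y simulated above:

    def quasi_difference(X, y, rho):
        # Remove AR(1) autocorrelation by transforming the model:
        # y*_t = y_t - rho*y_{t-1}, x*_t = x_t - rho*x_{t-1}
        return X[1:] - rho * X[:-1], y[1:] - rho * y[:-1]

    def ridge(X, y, k):
        # Ridge (biased) estimator: (X'X + k*I)^{-1} X'y
        p = X.shape[1]
        return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

    # Step 1: estimate rho from OLS residuals, then quasi-difference
    b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
    r = y - X @ b_ols
    rho_hat = (r[1:] @ r[:-1]) / (r[:-1] @ r[:-1])
    X_star, y_star = quasi_difference(X, y, rho_hat)

    # Step 2: ridge regression on the transformed data (k = 0.1 is hypothetical)
    b_ridge = ridge(X_star, y_star, k=0.1)

Treating the autocorrelation first matters because the biased estimators are motivated by the ill-conditioning of X'X, not by the error structure.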
From the results of the simulation experiments we concluded that, in data suffering from multicollinearity, the multicollinearity increases when the error term follows a first- or second-order autoregressive scheme, whereas it decreases when the model has few explanatory variables. Among the biased estimation methods, taking S² and R² as comparison criteria, latent root regression is the best when the error term follows a first-order autoregressive scheme with a high autocorrelation coefficient and the model has many explanatory variables, whereas principal components is the best when the model has few explanatory variables. Taking MSE as the comparison criterion, latent root regression and principal components are the best when the error term follows a first- or second-order autoregressive scheme. Among the variants of the ridge regression method, taking MSE as the criterion, ordinary ridge regression is the best when the sample size is very large; otherwise, generalized ridge regression is the best.
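The MSE criterion used in these comparisons can itself be estimated by Monte Carlo over repeated simulated samples; a minimal sketch, assuming MSE(b_hat) = E||b_hat - beta||^2 and using generate_data and estimator as placeholders for a data generator and an estimator like those sketched above:

    import numpy as np

    def monte_carlo_mse(generate_data, estimator, beta, reps=1000):
        # Average squared estimation error over simulated replications
        errs = []
        for _ in range(reps):
            X, y = generate_data()
            b_hat = estimator(X, y)
            errs.append(np.sum((b_hat - beta) ** 2))
        return np.mean(errs)
|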
en_US |