Abstract:
Regression analysis is one of the most widely used statistical techniques for analyzing
multifactor data. Its broad appeal results from the conceptually simple process of using
an equation to express the relationship between a set of variables. Regression analysis is
also interesting theoretically because of the elegant underlying mathematics. Successful
use of regression analysis requires an appreciation of both the theory and the practical
problems that often arise when the technique is employed with real-world data.
In the model-fitting process, the most frequently applied estimation procedure is
Ordinary Least Squares Estimation (OLSE). The principal advantage of OLSE is that it
provides minimum-variance unbiased linear estimates of the parameters in the linear
regression model.
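For reference, the standard result behind this claim can be stated as follows (the notation is added here for exposition and is not part of the original abstract). For the linear model

    y = X\beta + \varepsilon,  E(\varepsilon) = 0,  Var(\varepsilon) = \sigma^2 I,

the OLS estimator and its covariance matrix are

    \hat{\beta}_{OLS} = (X'X)^{-1} X' y,  Var(\hat{\beta}_{OLS}) = \sigma^2 (X'X)^{-1},

and by the Gauss-Markov theorem \hat{\beta}_{OLS} attains the smallest variance among all linear unbiased estimators.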
In many situations, both experimental and non-experimental, the independent variables
tend to be correlated among themselves. Inter-correlation, or multicollinearity, among
the independent variables is then said to exist. A variety of interrelated problems arise
when multicollinearity is present. In particular, in the model-building process, multicollinearity
among the independent variables inflates the variances of the OLS estimates, even though those
estimators remain the minimum-variance unbiased estimators in the class of linear unbiased
estimators.
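A minimal numerical sketch of this variance inflation is given below (in Python; the simulation design, sample size, and variable names are assumptions chosen for illustration, not taken from the study):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 200

    def ols_coef_sd(rho, n_reps=500):
        """Empirical standard deviation of the first OLS coefficient
        when the two regressors have correlation rho
        (true model: y = x1 + x2 + noise)."""
        coefs = []
        for _ in range(n_reps):
            x1 = rng.normal(size=n)
            # construct x2 with correlation rho to x1
            x2 = rho * x1 + np.sqrt(1 - rho**2) * rng.normal(size=n)
            y = x1 + x2 + rng.normal(size=n)
            X = np.column_stack([x1, x2])
            beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
            coefs.append(beta_hat[0])
        return np.std(coefs)

    for rho in (0.0, 0.9, 0.99):
        print(f"rho = {rho:4.2f}: sd of first coefficient = {ols_coef_sd(rho):.3f}")

As rho grows toward 1, the spread of the estimated coefficient increases sharply, even though its expectation remains at the true value.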
The main objective of this study is to show that unbiased estimation does not imply
good estimation when the regressors are correlated among themselves, that is, when multicollinearity
exists. Instead, the study motivates the use of biased estimation (ridge-type estimation),
which accepts a small bias in exchange for a low variance; together, these can yield a low mean
square error.
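The trade-off invoked here rests on two standard facts, restated for exposition. First, the mean square error decomposes as

    MSE(\hat{\beta}) = Var(\hat{\beta}) + [Bias(\hat{\beta})]^2,

and second, the ridge estimator with shrinkage parameter k > 0 is

    \hat{\beta}_{ridge}(k) = (X'X + kI)^{-1} X' y,

which is biased for k > 0 but can have a much smaller variance than OLS when X'X is near-singular. Below is a minimal sketch comparing the two estimators under strong collinearity (the value of k is picked by hand here; choosing k from the data is itself one of the practical problems):

    import numpy as np

    rng = np.random.default_rng(1)
    n, rho, k = 200, 0.99, 5.0
    beta_true = np.array([1.0, 1.0])

    ols_err, ridge_err = [], []
    for _ in range(500):
        x1 = rng.normal(size=n)
        x2 = rho * x1 + np.sqrt(1 - rho**2) * rng.normal(size=n)
        X = np.column_stack([x1, x2])
        y = X @ beta_true + rng.normal(size=n)
        # OLS: solve (X'X) b = X'y
        b_ols = np.linalg.solve(X.T @ X, X.T @ y)
        # Ridge: solve (X'X + kI) b = X'y
        b_ridge = np.linalg.solve(X.T @ X + k * np.eye(2), X.T @ y)
        ols_err.append(np.sum((b_ols - beta_true) ** 2))
        ridge_err.append(np.sum((b_ridge - beta_true) ** 2))

    print("empirical total MSE, OLS:  ", np.mean(ols_err))
    print("empirical total MSE, ridge:", np.mean(ridge_err))

In runs of this kind, the ridge estimator's small bias is more than offset by its reduced variance, giving a lower total mean square error, which is precisely the point argued above.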
This study also highlights the importance of the theoretical results already obtained, and
provides a path for researchers to apply those results in practical
situations.
Keywords: Multicollinearity, Least Squares Estimation, Restricted Least Squares