

Articles by Kayode Ayinde
Total Records (4) for Kayode Ayinde





O.O. Alabi, Kayode Ayinde and B.A. Oyejola


The effect of multicollinearity on the parameters of a regression model estimated by the Ordinary Least Squares (OLS) estimator extends beyond estimation to inference: large standard errors of the regression coefficients result in very low values of the t-statistic. Consequently, this study attempts to investigate empirically the effect of multicollinearity on the Type I error rates of the OLS estimator. A regression model with a constant term (β₀) and two independent variables (with β₁ and β₂ as their respective regression coefficients) that exhibit multicollinearity was considered. A Monte Carlo study of 1000 trials was conducted at eight levels of multicollinearity (0, 0.25, 0.5, 0.7, 0.75, 0.8, 0.9 and 0.99) and eight sample sizes (10, 20, 40, 80, 100, 150, 250 and 500). At each specification, the true regression coefficients were set at unity. Results show that the effect of multicollinearity on the OLS estimator is not serious: the Type I error rate of β₀ is not significantly different from the preselected level of significance (0.05) at all levels of multicollinearity and sample sizes, and those of β₁ and β₂ exhibit a significant difference from 0.05 at only very few levels of multicollinearity and sample sizes. Even at these levels, the estimated significance level does not differ much from 0.06. 
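A minimal sketch of the kind of experiment this abstract describes: generate two regressors with a chosen correlation, fit OLS, test each coefficient against its true value of unity, and count rejections over many trials. The function name is illustrative, and the large-sample critical value 1.96 is used in place of an exact t critical value for simplicity, so the rate runs slightly above 0.05 in small samples.

```python
import numpy as np

def type1_error_rate(rho, n, trials=1000, alpha=0.05, seed=0):
    """Monte Carlo Type I error rate of the OLS test for beta_1 when the
    two regressors have correlation rho (illustrative sketch; uses the
    normal critical value 1.96 rather than an exact t quantile)."""
    rng = np.random.default_rng(seed)
    crit = 1.96  # approximate two-sided 5% critical value
    rejections = 0
    for _ in range(trials):
        x1 = rng.standard_normal(n)
        # x2 is built to have correlation rho with x1
        x2 = rho * x1 + np.sqrt(1 - rho**2) * rng.standard_normal(n)
        X = np.column_stack([np.ones(n), x1, x2])
        y = X @ np.array([1.0, 1.0, 1.0]) + rng.standard_normal(n)
        XtX_inv = np.linalg.inv(X.T @ X)
        b = XtX_inv @ X.T @ y
        resid = y - X @ b
        s2 = resid @ resid / (n - 3)
        se = np.sqrt(s2 * np.diag(XtX_inv))
        # test H0: beta_1 = 1, the true value, so rejections estimate Type I error
        if abs((b[1] - 1.0) / se[1]) > crit:
            rejections += 1
    return rejections / trials
```

Because the null hypothesis fixes each coefficient at its true value, the rejection rate should hover near the nominal 0.05 even at high multicollinearity, which is the pattern the study reports.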





Kayode Ayinde and B.A. Oyejola


The OLS estimator of the Classical Linear Regression Model is known to be inconsistent when regressors are correlated with the error terms. However, this does not imply that inference is impossible. In this study, we compare the performances of the OLS estimator and some Feasible GLS estimators when stochastic regressors are correlated with the error terms, through Monte Carlo studies at both low and high replications. The performances of the estimators are compared at various levels of correlation using the following small-sample properties: bias, absolute bias, variance and, more importantly, the mean squared error of the model parameters. Results show that the OLS and GLS estimators considered in the study are equally good at estimating the model parameters when replication is low. However, with increased replication, the OLS estimator is the most efficient, even though the performances of all the estimators exhibit no significant difference as the correlation between regressor and error terms tends to ±1. 
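The inconsistency mentioned above can be seen directly in a single-regressor sketch: when the regressor shares a component with the error term, the OLS slope is biased toward the sign of that correlation. The function name and the specific data-generating process below are illustrative assumptions, not the study's exact design.

```python
import numpy as np

def ols_bias_endogenous(corr, n=100, trials=500, seed=0):
    """Average bias of the OLS slope when the regressor is correlated
    with the error term (hypothetical one-regressor sketch)."""
    rng = np.random.default_rng(seed)
    biases = []
    for _ in range(trials):
        z = rng.standard_normal(n)
        # error term shares the common component z with the regressor,
        # so corr controls the regressor-error correlation
        e = corr * z + np.sqrt(1 - corr**2) * rng.standard_normal(n)
        x = z
        y = 1.0 + 2.0 * x + e  # true slope = 2
        b = np.cov(x, y, bias=True)[0, 1] / np.var(x)  # OLS slope
        biases.append(b - 2.0)
    return float(np.mean(biases))
```

In this setup the probability limit of the bias is cov(x, e)/var(x) = corr, so the Monte Carlo average bias tracks the chosen correlation and vanishes only when corr = 0.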




Kayode Ayinde


The classical linear regression assumptions that regressors are independent and nonstochastic in repeated sampling are often violated by economists and other social scientists, because their regressors are generated by stochastic processes beyond their control. Consequently, in this study we examine the performances of the Ordinary Least Squares (OLS) estimator and four Generalized Least Squares (GLS) estimators of a linear model with autocorrelated error terms when normally distributed stochastic regressors exhibit multicollinearity. These estimators are compared by examining their finite sampling properties, through Monte Carlo studies, at various levels of autocorrelation and of violation of the no-multicollinearity assumption. Results show that the Maximum Likelihood (ML) and the Hildreth-Lu (HILU) estimators are generally preferable for estimating all the parameters of the model at all levels of autocorrelation and multicollinearity. Consequently, when these two forms of correlation cannot be ruled out in a data set, it is preferable to use either the ML or the HILU estimator to estimate all parameters of the model. 
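The Hildreth-Lu approach named above works by grid search: for each candidate AR(1) coefficient ρ, quasi-difference the model and keep the ρ that minimizes the residual sum of squares of the transformed regression. A minimal sketch of that idea, under the assumption of a simple AR(1) error and a coarse ρ grid (the function name and grid are illustrative):

```python
import numpy as np

def hildreth_lu(y, X, grid=np.linspace(-0.95, 0.95, 39)):
    """Hildreth-Lu style grid search for the AR(1) coefficient rho:
    quasi-difference the model with each candidate rho and keep the
    one with the smallest SSE (illustrative sketch)."""
    best = None
    for rho in grid:
        # quasi-differenced (transformed) model: y_t - rho*y_{t-1}, etc.
        ys = y[1:] - rho * y[:-1]
        Xs = X[1:] - rho * X[:-1]
        b, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
        sse = np.sum((ys - Xs @ b) ** 2)
        if best is None or sse < best[0]:
            best = (sse, rho, b)
    _, rho_hat, b_hat = best
    return rho_hat, b_hat
```

The coefficient estimates returned alongside the winning ρ are the GLS-type estimates the study compares against OLS and ML.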





Kayode Ayinde and R.A. Ipinyomi


Regressors are assumed fixed (nonstochastic) in repeated samples in the Classical Linear Regression Model, but situations where this assumption is not tenable are often found in economics and other social sciences. In this study, we made a comparative study of the Ordinary Least Squares (OLS) estimator and some Feasible Generalized Least Squares (GLS) estimators when normally distributed regressors are stochastic, using Monte Carlo methods under both low and high replications. Comparison was done by examining the small-sample performances of the estimators via bias, absolute bias, variance and, more importantly, the mean squared error of the estimated model parameters. Results show that the performances of the estimators improve with increased replication. The Maximum Likelihood (ML) and Maximum Likelihood Grid (MLGD) estimators compete with the OLS estimator only when replication is low. However, with increased replication, the OLS estimator is the most efficient among the estimators in estimating all the parameters of the model. 
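The comparison criteria used in this and the preceding studies (bias, absolute bias, variance, MSE) can be computed from any Monte Carlo sample of estimates of a parameter. A small helper, with an illustrative name, making explicit the decomposition MSE = variance + bias²:

```python
import numpy as np

def mc_properties(estimates, true_value):
    """Bias, absolute bias, variance, and MSE of a Monte Carlo sample
    of parameter estimates (the comparison criteria used above)."""
    est = np.asarray(estimates, dtype=float)
    bias = est.mean() - true_value
    return {
        "bias": bias,
        "abs_bias": np.abs(est - true_value).mean(),
        "variance": est.var(),
        "mse": est.var() + bias**2,  # MSE = variance + bias^2
    }
```

Ranking estimators by MSE, as these studies do, weighs both precision (variance) and systematic error (bias) in a single number.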





