INTRODUCTION
Regression analysis is often used in the analysis of survey data collected under complex sampling designs. Okafor (2002) highlighted that the regression estimation method uses auxiliary information to improve estimates of population parameters such as the mean and total. He further noted that regression estimation is used to estimate the population mean when the regression line of y on x does not pass through the origin but makes an intercept along the y-axis.
Matloff (1981) estimated the unconditional mean μ of a variable Y from a random sample drawn from a population with distribution F_{xy}. He compared this estimate with one obtained from the linear regression of Y, the dependent variable, on X, the auxiliary (independent) variable, by averaging the estimated regression function values at the sample points. He found that the latter estimator offered a substantial improvement over the former, the sample mean of Y.
In another development, Jewell and Queensberry (1986) considered an iterative regression method for data collected from stratified samples using a design variable that is correlated with the dependent variable but is not included as an independent variable. They showed that the method gave a generally superior estimate of the mean in terms of efficiency.
In all these studies, the linear regression of a variable of interest Y (the dependent variable) on an auxiliary variable X (the independent variable) was considered for the estimation of population parameters. This study therefore considers the case where the variable of interest Y has a polynomial (nonlinear) relationship with the auxiliary variable X. The study treats mainly the polynomial model of order two, given by

Y = β_{0} + β_{1}X + β_{2}X^{2} + E
Ratkowsky (1983) indicated that numerous nonlinear models, such as the Weibull-type model, have been used to fit the sigmoidal growth curves widespread in biology, agriculture, engineering and economics. For a standard nonlinear regression analysis, a correct regression model is often assumed to exist, that is, a model for which the algebraic sum of the errors is zero. Statistical techniques for estimating the model parameters have been developed under the assumption of independent observations by Draper and Smith (1981). Asymptotic properties of least squares estimators have been discussed extensively in the literature, for example by Gallant (1987) and Wu (1981).
Bunke (1993) showed a high-order asymptotic equivalence between extended jackknife and asymptotic estimates in the nonlinear regression model. Hung (1985) considered regression estimation with transformed auxiliary variates.
MATERIALS AND METHODS
A sample of size n = 100 is selected by simple random sampling without replacement (srswor) from the generated sets of dependent (Y) and independent (X) variables.
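The sampling step can be sketched as follows. This is a minimal illustration, not the study's own code: the population size N, the coefficient values and the error distribution are assumptions chosen only to mimic a quadratic relationship.

```python
import random

random.seed(42)

# Hypothetical population following the quadratic model
# Y = b0 + b1*X + b2*X**2 + E  (coefficient values are illustrative assumptions)
N = 1000                      # assumed population size
b0, b1, b2 = 5.0, 2.0, 1.5

population = []
for _ in range(N):
    x = random.uniform(0, 10)
    e = random.gauss(0, 1.0)  # random error term
    y = b0 + b1 * x + b2 * x**2 + e
    population.append((x, y))

# Simple random sampling without replacement (srswor) of size n = 100
n = 100
sample = random.sample(population, n)
print(len(sample))
```

`random.sample` draws without replacement, so each population unit appears at most once in the sample, matching the srswor design.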
Derivation of estimators of the population parameters (mean and total) and their corresponding variances: Given the general quadratic equation

Y = β_{0} + β_{1}X + β_{2}X^{2} + E (1)

it can be transformed into an intrinsically linear model as:

Y = β_{0} + β_{1}X + β_{2}Z + E (2)

Where:

Z = X^{2}

Therefore the unbiased estimator of the mean is given by:

ȳ_{pr} = ȳ + β_{1}(X̄ - x̄) + β_{2}(Z̄ - z̄) (3)

and its variance is given by:

V(ȳ_{pr}) = V(ȳ) + β_{1}^{2}V(x̄) + β_{2}^{2}V(z̄) - 2β_{1}Cov(x̄, ȳ) - 2β_{2}Cov(z̄, ȳ) + 2β_{1}β_{2}Cov(x̄, z̄) (4)

but, under srswor with sampling fraction f = n/N,

V(ȳ) = ((1-f)/n)S_{y}^{2}, V(x̄) = ((1-f)/n)S_{x}^{2}, V(z̄) = ((1-f)/n)S_{z}^{2} (5)

Also

Cov(x̄, ȳ) = ((1-f)/n)S_{xy}, Cov(z̄, ȳ) = ((1-f)/n)S_{zy}, Cov(x̄, z̄) = ((1-f)/n)S_{xz} (6)

Therefore, substituting (5) and (6) into (4) gives:

V(ȳ_{pr}) = ((1-f)/n)[S_{y}^{2} + β_{1}^{2}S_{x}^{2} + β_{2}^{2}S_{z}^{2} - 2β_{1}S_{xy} - 2β_{2}S_{zy} + 2β_{1}β_{2}S_{xz}] (7)

with sample estimate

v(ȳ_{pr}) = ((1-f)/n)[s_{y}^{2} + β_{1}^{2}s_{x}^{2} + β_{2}^{2}s_{z}^{2} - 2β_{1}s_{xy} - 2β_{2}s_{zy} + 2β_{1}β_{2}s_{xz}] (8)

We need to estimate β_{1} and β_{2} such that V(ȳ_{pr}) is a minimum. By the method of ordinary least squares, we differentiate (7) partially with respect to β_{1} and β_{2} to obtain the following normal equations:

β_{1}S_{x}^{2} + β_{2}S_{xz} = S_{xy} (9)

β_{1}S_{xz} + β_{2}S_{z}^{2} = S_{zy} (10)

Solving (9) and (10) simultaneously, we obtain

β_{1} = (S_{xy}S_{z}^{2} - S_{zy}S_{xz}) / (S_{x}^{2}S_{z}^{2} - S_{xz}^{2}) (11)

β_{2} = (S_{zy}S_{x}^{2} - S_{xy}S_{xz}) / (S_{x}^{2}S_{z}^{2} - S_{xz}^{2}) (12)

or, in terms of the sample moments,

b_{1} = (s_{xy}s_{z}^{2} - s_{zy}s_{xz}) / (s_{x}^{2}s_{z}^{2} - s_{xz}^{2}) (13)

b_{2} = (s_{zy}s_{x}^{2} - s_{xy}s_{xz}) / (s_{x}^{2}s_{z}^{2} - s_{xz}^{2}) (14)

and the total estimate and its variance are given as:

Ŷ_{pr} = Nȳ_{pr} (15)

V(Ŷ_{pr}) = N^{2}V(ȳ_{pr}) (16)
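The estimator derived above can be sketched in code. This is a minimal illustration under the assumption that the population means of the auxiliary variable X and of its square Z = X^{2} are known; the function and variable names are hypothetical, not from the study.

```python
from statistics import mean

def polynomial_regression_estimate(y, x, pop_mean_x, pop_mean_z, N):
    """Regression-type estimate of the population mean of Y using the
    auxiliary variable X and its square Z = X**2.
    pop_mean_x, pop_mean_z: known population means of X and Z; N: population size."""
    n = len(y)
    z = [xi**2 for xi in x]
    ybar, xbar, zbar = mean(y), mean(x), mean(z)

    # Unbiased sample variances and covariances (divisor n - 1)
    def cov(u, v):
        ubar, vbar = mean(u), mean(v)
        return sum((ui - ubar) * (vi - vbar) for ui, vi in zip(u, v)) / (n - 1)

    s_x2, s_y2, s_z2 = cov(x, x), cov(y, y), cov(z, z)
    s_xy, s_zy, s_xz = cov(x, y), cov(z, y), cov(x, z)

    # Least-squares coefficients from the normal equations (Eq. 13 and 14)
    det = s_x2 * s_z2 - s_xz**2
    b1 = (s_xy * s_z2 - s_zy * s_xz) / det
    b2 = (s_zy * s_x2 - s_xy * s_xz) / det

    # Regression estimate of the population mean of Y
    y_pr = ybar + b1 * (pop_mean_x - xbar) + b2 * (pop_mean_z - zbar)

    # Estimated variance under srswor (Eq. 8), with sampling fraction f = n/N
    f = n / N
    v_y_pr = ((1 - f) / n) * (s_y2 + b1**2 * s_x2 + b2**2 * s_z2
                              - 2 * b1 * s_xy - 2 * b2 * s_zy
                              + 2 * b1 * b2 * s_xz)

    # Total estimate and its variance
    total, v_total = N * y_pr, N**2 * v_y_pr
    return y_pr, v_y_pr, total, v_total
```

When the relationship between Y and X is exactly quadratic, the fitted coefficients recover the true β_{1} and β_{2}, the mean estimate equals the population mean, and the estimated variance collapses to (approximately) zero.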
To determine the efficiency and precision of the above method, its estimates will be compared with those from the linear regression and elemental sampling methods. The method with the least variance yields the most efficient and precise estimates.
We note here that s_{xy}, s_{zy}, s_{xz} are estimators of the population covariances S_{xy}, S_{zy} and S_{xz}, respectively; while the variances s_{x}^{2}, s_{y}^{2}, s_{z}^{2} are unbiased estimators of the population variances S_{x}^{2}, S_{y}^{2}, S_{z}^{2}, respectively.
DATA ANALYSIS
s_{xy} = 3271.758, s_{zy} = 37433.466, s_{xz} = 16.208

s_{x}^{2} = 1.4462, s_{y}^{2} = 8046950.712, s_{z}^{2} = 185.760

From Eq. 13 and 14, b_{1} and b_{2} are computed from these sample moments. The variance of the mean estimate is then obtained from Eq. 8.
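Substituting the sample moments above into Eq. 13 and 14 is straightforward arithmetic; the following sketch (not the study's own code) carries out the computation:

```python
# Sample moments quoted in the text
s_xy, s_zy, s_xz = 3271.758, 37433.466, 16.208
s_x2, s_y2, s_z2 = 1.4462, 8046950.712, 185.760

# Eq. 13 and 14: least-squares coefficient estimates
det = s_x2 * s_z2 - s_xz**2
b1 = (s_xy * s_z2 - s_zy * s_xz) / det
b2 = (s_zy * s_x2 - s_xy * s_xz) / det

print(round(b1, 2), round(b2, 2))
```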
The means and their variances under the linear regression and elemental sampling methods, together with the corresponding total estimates, are obtained analogously and are summarized in Table 1.
RESULTS AND DISCUSSION
From Table 1, the polynomial (quadratic) regression method provides estimates with the least variances, followed by those of the linear regression method. Therefore, the polynomial regression method gives the most efficient estimates of the population parameters (mean and total). The efficiency and precision of the regression estimation method depend on the degree of relationship between the auxiliary (independent) variable and the variable of interest, on the nature or pattern of that relationship, and on the sample size. If the appropriate relationship between the auxiliary variable and the variable of interest is a polynomial one and one misspecifies the model, estimating the population parameters by the linear regression estimation method, the resulting parameter estimates will be inefficient or less efficient.
Table 1: Summary of estimates of means, totals and their variances
A similar loss of efficiency occurs when the degree of relationship is low. It is also noteworthy that choosing the appropriate order of the polynomial is imperative when the relationship is polynomial.
This research shows that a variable having a polynomial relationship with an auxiliary variable will have its parameters well estimated by the polynomial regression method, which gives more efficient and precise estimates.
CONCLUSION
Polynomial regression estimation method provides a way of making good, efficient and precise estimates of characteristics of variables that have polynomial relationships with their auxiliary variables. It is therefore necessary for one to understand the nature and pattern of relationship between a variable of interest and the independent variable before choosing a method for estimation of parameters.