INTRODUCTION
For spacecraft viewing the Earth, we require knowledge and control of the orientation and position of the spacecraft with respect to position on the Earth’s surface. This is particularly important for Earth remote sensing missions, where knowledge of the precise location of a sensed datum is critical. Typical pointing accuracy requirements range from 0.05° to better than a minute of arc. For a spacecraft in low Earth orbit, the Earth covers up to approximately 40% of the sky and is an extended object, unlike the stars, which are point sources to a good approximation.
Horizon scanners on the spacecraft are the principal means for directly determining the orientation of the spacecraft with respect to the Earth. The scanner uses an infrared sensor to detect the temperature difference between the Earth and space.
The horizon scanner consists of a mechanism for scanning a cone centered on the instrument’s boresight, associated optics, a radiance detector (thermistor), and signal processing electronics. The scanner sweeps across the Earth looking for horizon crossings (i.e., the edges of the warm Earth against the cold background of space).
The important task in the horizon sensor model is to know the shape of the Earth disk as observed by the spacecraft, for then we can compute the reference vector (the nominal pointing direction) and compare it with the sensor observation to obtain the attitude (e.g., roll and pitch). For precise work, a correction for oblateness must be applied. The Earth’s oblateness has two effects on the apparent shape of the Earth: first, the Earth appears somewhat oblate rather than round, and second, the centre of the visible oblate Earth is displaced from the true geometric centre of the Earth.
EARTH OBLATENESS AND THE SHAPE OF THE HORIZON FROM SPACE
The shape of the Earth is not quite spherical, and attitude determination requires a more accurate model. The basic model represents the Earth as an ellipsoid or oblate spheroid; higher-order approximations based on spherical harmonic expansions are used where necessary. In the ellipsoidal model, the ellipsoid is defined by the equatorial radius of the Earth R_{e} ≈ 6378.140 km, the polar radius of the Earth R_{p}, and the flattening, f [1].
The shape of the Earth, as defined by the horizon seen by an Earth sensor, must be known. The horizon is defined as the locus of points where the observer’s line of sight is tangent to the Earth’s surface, i.e., perpendicular to the surface normal.
Using the ellipsoidal model, we may find the effective radius of the Earth at geocentric latitude λ, to first order in the flattening, from:

R(λ) = R_{e}(1 − f sin^{2}λ) + h + k

where f is the flattening and, for infrared sensors which trigger on the atmosphere, h represents the trigger height of the atmosphere for the sensor and k represents seasonal and latitudinal variations in the height of the atmosphere.
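As a numerical sketch of this radius model (assuming the first-order form R(λ) = R_{e}(1 − f sin²λ) + h + k; the trigger height of 40 km and zero seasonal correction are illustrative values, not from the source):

```python
import math

def earth_radius_at_latitude(lat_rad, Re=6378.140, f=1/298.257, h=40.0, k=0.0):
    """First-order oblate-Earth radius (km) at geocentric latitude, plus an
    assumed IR trigger height h (km) and seasonal correction k (km)."""
    return Re * (1.0 - f * math.sin(lat_rad) ** 2) + h + k

# The equatorial value recovers Re + h; the polar value is smaller by ~Re*f.
r_eq = earth_radius_at_latitude(0.0)
r_pole = earth_radius_at_latitude(math.pi / 2)
```

The flattening value 1/298.257 is the standard reference-ellipsoid figure; any mission would substitute its adopted constants.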
The ellipsoidal shape is described by:

x^{2}/a^{2} + y^{2}/a^{2} + z^{2}/c^{2} = 1

or, in terms of the flattening f (with c = a(1 − f)), by:

x^{2} + y^{2} + z^{2}/(1 − f)^{2} = a^{2}

where a is the equatorial radius and c is the polar radius. The normal to this surface is given by the gradient, and so the normal unit vector is in the direction:

N ∝ (x, y, z/(1 − f)^{2})
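The gradient-based normal can be checked numerically; a minimal sketch (the function name is ours):

```python
import math

def oblate_normal(x, y, z, f=1/298.257):
    """Unit outward normal to x^2 + y^2 + z^2/(1-f)^2 = a^2,
    proportional to (x, y, z/(1-f)^2)."""
    nz = z / (1.0 - f) ** 2
    mag = math.sqrt(x * x + y * y + nz * nz)
    return (x / mag, y / mag, nz / mag)

# On the equator the normal is purely radial; at the pole it points along z.
n_equator = oblate_normal(6378.140, 0.0, 0.0)
n_pole = oblate_normal(0.0, 0.0, 6356.75)
```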

Fig. 1: Geometry of the horizon vector, H, and surface normal, N, for an oblate Earth
In Fig. 1, P = (u, v, w) represents the location of the observer, and R = (x, y, z) represents a point on the horizon. The vector from the observer to the horizon, known as the horizon vector r_{H}, is given by:

r_{H} = R − P = (x − u, y − v, z − w)
Since R is a horizon point, r_{H} must be perpendicular to N:

r_{H} · N = 0

i.e.

(x − u)x + (y − v)y + (z − w)z/(1 − f)^{2} = 0
To reveal the geometry of the situation we rearrange terms (completing the squares):

(x − u/2)^{2} + (y − v/2)^{2} + (z − w/2)^{2}/(1 − f)^{2} = (u^{2} + v^{2})/4 + w^{2}/(4(1 − f)^{2})

which, on comparison with the equation for the Earth ellipsoid, can be seen to be the equation of another ellipsoid, scaled and displaced from the Earth.
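The perpendicularity condition r_{H} · N = 0 can be verified numerically. In the equatorial plane the horizon point reduces to the classical tangent point of a circle of radius a, so we can construct one directly (a sketch; observer distance of 7000 km is illustrative):

```python
import math

a, f = 6378.140, 1 / 298.257
d = 7000.0                       # observer geocentric distance (km), equatorial
u, v, w = d, 0.0, 0.0            # observer position P

# Tangent point of the equatorial circle of radius a, seen from distance d.
x = a * a / d
y = a * math.sqrt(1.0 - (a / d) ** 2)
z = 0.0

rH = (x - u, y - v, z - w)               # horizon vector R - P
N = (x, y, z / (1.0 - f) ** 2)           # (unnormalised) surface normal
dot = sum(p * q for p, q in zip(rH, N))  # should vanish to rounding error
```

A vanishing dot product confirms that R is a horizon point for the observer at P.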

Fig. 2: Meridian cross-section of the Earth showing the horizon spheroid and the horizon plane
In Fig. 2 the horizon ellipsoid is the locus of all possible horizon points, over all planetary sizes, for a given observer position. The intersection of the two surfaces is the locus of the observer’s planetary horizon, which is elliptical. The horizon ellipse lies on a plane known as the horizon plane, and we obtain the equation of this plane by solving Eqs. 4 and 9 simultaneously, which gives:

ux + vy + wz/(1 − f)^{2} = a^{2}
The normal to this plane is in the direction (u, v, w/(1 − f)^{2}), or, in terms of the geocentric latitude λ and longitude Φ of the observer’s position:

n ∝ (cos λ cos Φ, cos λ sin Φ, sin λ/(1 − f)^{2})
The plane normal given by Eq. 11 is not in general coincident with the nadir line of the observer. For an observer at a distance d from the centre of the Earth, the possible horizon planes are parallel for a given angular position, and they intersect the nadir line at a distance D from the Earth’s centre given by:

D = R^{2}/d

where R is the distance from the Earth’s centre to the sub-satellite point on the surface. R is given by:

R = R_{e}(1 − f)/√(1 − (2f − f^{2})cos^{2}λ)
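A numerical sketch of these two relations (assuming the geocentric surface radius R(λ) = R_{e}(1 − f)/√(1 − (2f − f²)cos²λ) and D = R²/d; the 700 km-altitude example is illustrative):

```python
import math

def geocentric_surface_radius(lat, a=6378.140, f=1/298.257):
    """Geocentric distance (km) to the ellipsoid surface at geocentric
    latitude lat (rad); equals a at the equator, a*(1-f) at the poles."""
    return a * (1.0 - f) / math.sqrt(1.0 - (2.0 * f - f * f) * math.cos(lat) ** 2)

def horizon_plane_distance(lat, d, a=6378.140, f=1/298.257):
    """Distance D from the Earth's centre to the horizon plane along the
    nadir line, for an observer at geocentric distance d: D = R^2 / d."""
    R = geocentric_surface_radius(lat, a, f)
    return R * R / d

# Observer at 45 deg geocentric latitude, geocentric distance 7078 km
D = horizon_plane_distance(math.radians(45.0), 7078.0)
```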
To establish the shape of the Earth as seen by the spacecraft sensor, we solve Eqs. (4) and (10) in the local coordinate system defined by N, E, and Z through P.
We find the angular radius of the Earth as a function of the horizon azimuth, where λ is the geocentric latitude of the spacecraft position; d and R are the distances from the centre of the Earth to the spacecraft and to the point on the Earth’s surface below the satellite, respectively; Φ is the azimuth angle of the horizon vector H in local tangent coordinates; and ρ is the angle between the nadir vector and the horizon vector.
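The full oblate expression for ρ varies with azimuth by terms of order f. As a point of reference only, the spherical-Earth leading term is ρ = arcsin(R/d); a sketch of that approximation (not the full oblate formula of the source):

```python
import math

def angular_radius_spherical(R, d):
    """Spherical-Earth angular radius of the disc: rho = arcsin(R/d),
    the leading term that the oblateness corrections perturb."""
    return math.asin(R / d)

# For a ~700 km orbit the Earth disc half-angle is roughly 64 degrees.
rho = angular_radius_spherical(6378.140, 7078.0)
```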
SENSOR MEASUREMENTS
The determination of a spacecraft’s attitude is equivalent to determining the rotation between the Satellite Body-Fixed Frame (SBFF) and some known reference frame, such as the Earth Centered Inertial (ECI) frame. The Earth sensors are responsible for delivering the nadir pointing vector to the Attitude Determination System (ADS) expressed in the SBFF [2,4].
Figure 3 illustrates the coordinate frames defined above. The ECI frame is denoted by the axes X_{i}, Y_{i}, Z_{i}. The Satellite Reference Frame (SRF), denoted X_{r}, Y_{r}, Z_{r}, is obtained by a coordinate transformation from the ECI frame and is updated continuously throughout each orbit. The SBFF, shown as X_{b}, Y_{b}, Z_{b}, represents the true attitude of the satellite.

Fig. 3: Inertial, satellite reference, and satellite body-fixed coordinate frames for sensor measurements
An attitude determination algorithm is then used to find the rotation matrix relating the SBFF and ECI frames, denoted R^{bi}, such that, for any vector v:

v^{b} = R^{bi} v^{i}
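A minimal sketch of applying such a frame transformation, with a simple rotation about the z-axis standing in for a real estimated attitude matrix (helper names are ours):

```python
import math

def matvec(M, v):
    """3x3 matrix-vector product on tuples."""
    return tuple(sum(M[i][j] * v[j] for j in range(3)) for i in range(3))

def rot_z(theta):
    """Rotation matrix mapping inertial components to body components
    for a body yawed by theta about the z-axis."""
    c, s = math.cos(theta), math.sin(theta)
    return ((c, s, 0.0), (-s, c, 0.0), (0.0, 0.0, 1.0))

Rbi = rot_z(math.radians(90.0))
v_i = (1.0, 0.0, 0.0)   # reference vector expressed in the ECI frame
v_b = matvec(Rbi, v_i)  # same physical vector expressed in the body frame
```

For a 90° yaw the inertial x-axis appears along the body −y-axis, as expected.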
The attitude determination analyst needs to understand how various sensors measure the body-frame components, how mathematical models are used to determine the inertial-frame components, and how standard attitude determination algorithms are used to estimate R^{bi}.
ORBIT MODEL
The attitude motion is approximately decoupled from orbital motion, so the two subjects are typically treated separately. More precisely, the orbital motion does have a significant effect on the attitude motion, but the attitude motion has a much less significant effect on the orbital motion. For this reason orbital dynamics is normally covered first and is a prerequisite topic for attitude dynamics. In our work, the satellite position in Earth Centered Inertial (ECI) coordinates is predicted using a Simplified General Perturbations (SGP4) type model described by Hoots [3].
A Graphical User Interface (GUI) was created using MATLAB to predict the orbit position; the algorithm uses the SGP4 model and takes the classical orbit parameters and the time as input, as shown in Fig. 4 and 5 [4].
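As a simplified, hedged stand-in for the SGP4 propagation step (two-body only, no perturbations, which SGP4 additionally models), converting classical elements to an ECI position can be sketched as:

```python
import math

MU = 398600.4418  # Earth's gravitational parameter, km^3/s^2

def kepler_E(M, e, tol=1e-12):
    """Solve Kepler's equation M = E - e*sin(E) by Newton iteration."""
    E = M if e < 0.8 else math.pi
    for _ in range(50):
        dE = (E - e * math.sin(E) - M) / (1.0 - e * math.cos(E))
        E -= dE
        if abs(dE) < tol:
            break
    return E

def elements_to_eci(a, e, i, raan, argp, M):
    """Two-body position (km) in ECI from classical elements (angles in rad).
    A simplified stand-in for SGP4, not the perturbed model of the source."""
    E = kepler_E(M, e)
    # Position in the perifocal (orbital-plane) frame
    xp = a * (math.cos(E) - e)
    yp = a * math.sqrt(1.0 - e * e) * math.sin(E)
    # Rotate perifocal -> ECI through argp, inclination, and RAAN
    cO, sO = math.cos(raan), math.sin(raan)
    ci, si = math.cos(i), math.sin(i)
    cw, sw = math.cos(argp), math.sin(argp)
    x = (cO * cw - sO * sw * ci) * xp + (-cO * sw - sO * cw * ci) * yp
    y = (sO * cw + cO * sw * ci) * xp + (-sO * sw + cO * cw * ci) * yp
    z = (sw * si) * xp + (cw * si) * yp
    return (x, y, z)

# Near-circular sun-synchronous-like example orbit at perigee (illustrative)
r = elements_to_eci(7078.0, 0.001, math.radians(98.0), 0.0, 0.0, 0.0)
```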

Fig. 4: Simulation of a satellite orbiting the Earth

Fig. 5: Configuration GUI window
The Mathematical Model Analysis: Attitude determination uses a combination of sensors and mathematical models to collect vector components in the body and inertial reference frames. These components are used in one of several different algorithms to determine the attitude, typically in the form of a quaternion, Euler angles, or a rotation matrix [2,5].
The horizon vectors modelled as shown in Figs. 6-17 were obtained from a −X- and Y-looking pair of sensors. The oblateness model of the Earth was used to compute the true horizon angle below the X-Y plane [2].

Fig. 6: X vector component for satellite X-axis horizon sensor in body frame

Fig. 7: Y vector component for satellite X-axis horizon sensor in body frame

Fig. 8: Z vector component for satellite X-axis horizon sensor in body frame

Fig. 9: X vector component for satellite Y-axis horizon sensor in body frame

Fig. 10: Y vector component for satellite Y-axis horizon sensor in body frame

Fig. 11: Z vector component for satellite Y-axis horizon sensor in body frame

Fig. 12: X vector component for satellite X-axis horizon sensor in orbit frame

Fig. 13: Y vector component for satellite X-axis horizon sensor in orbit frame

Fig. 14: Z vector component for satellite X-axis horizon sensor in orbit frame

Fig. 15: Z vector component for satellite X-axis horizon sensor in orbit frame

Fig. 16: X vector component for satellite Y-axis horizon sensor in orbit frame

Fig. 17: Z vector component for satellite Y-axis horizon sensor in orbit frame
CONCLUSION
There are two basic classes of attitude sensors. The first class makes absolute measurements, whereas the second class makes relative measurements. Absolute measurement sensors are based on the fact that knowing the position of a spacecraft in its orbit makes it possible to compute the vector directions, with respect to an inertial frame, of certain astronomical objects, and of the force lines of the Earth's magnetic field.
Absolute measurement sensors measure these directions with respect to a spacecraft or body fixed reference frame, and by comparing the measurements with the known reference directions in an inertial reference frame, are able to determine (at least approximately) the relative orientation of the body frame with respect to the inertial frame.
The attitude determination problem involves using two or more sensors to measure the components of distinct reference vectors in the body frame, and using mathematical models to calculate the components of the same reference vectors in an inertial frame. These vectors are then used in an algorithm to estimate the attitude representation, usually a rotation matrix, a set of Euler angles, or a quaternion.
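The two-vector case described above is classically solved by the TRIAD algorithm; a minimal pure-Python sketch (the vector helpers are ours; the source does not name a specific algorithm):

```python
import math

def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def unit(v):
    """Normalise a 3-vector."""
    m = math.sqrt(sum(c * c for c in v))
    return tuple(c / m for c in v)

def triad(v1_b, v2_b, v1_i, v2_i):
    """TRIAD: build orthonormal triads from two measured (body-frame) and two
    modelled (inertial-frame) vectors; return R^{bi} with v_b = R^{bi} v_i."""
    t1b, t1i = unit(v1_b), unit(v1_i)
    t2b, t2i = unit(cross(v1_b, v2_b)), unit(cross(v1_i, v2_i))
    t3b, t3i = cross(t1b, t2b), cross(t1i, t2i)
    Mb = (t1b, t2b, t3b)   # rows: body-frame triad
    Mi = (t1i, t2i, t3i)   # rows: inertial-frame triad
    # R^{bi} = Mb^T * Mi, so that R^{bi} maps each inertial triad vector
    # onto the corresponding body triad vector
    return tuple(tuple(sum(Mb[k][r] * Mi[k][c] for k in range(3))
                       for c in range(3)) for r in range(3))

# Sanity case: body and inertial measurements agree, so R^{bi} = I
R = triad((1, 0, 0), (0, 1, 0), (1, 0, 0), (0, 1, 0))
```

TRIAD weights the first vector exactly and the second only through the cross product, so the more accurate measurement should be passed first.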