**Guo Jing-Lei**

Department of Computer Science, Hua Zhong Normal University, Wuhan Hubei, 430079, China

Wu Zhi-jian

State Key Lab of Software Engineering, Wuhan University, Wuhan Hubei, 430072, China


Inverse problems, which arise widely in scientific and engineering areas, seek to determine unknown properties of objects. Their ill-posed and nonlinear nature makes such problems difficult to solve. An improved particle swarm optimization with a new velocity updating equation and without a velocity limit is proposed to solve the parameter identification problem of an elliptic differential equation. When the evolution process falls into stagnation, new promising particles are generated randomly by a multi-parent crossover operator. The numerical results show that the hybrid PSO algorithm is effective for parameter identification problems of elliptic differential equations and is not very sensitive to noise.


Guo Jing-Lei and Wu Zhi-jian, 2011. Parameter Identifications of Elliptic Differential Equation by Hybrid Particle Swarm Optimization. *Information Technology Journal, 10: 152-157.*

**DOI:** 10.3923/itj.2011.152.157

**URL:** https://scialert.net/abstract/?doi=itj.2011.152.157

The aim of an inverse problem is to find out an unknown property of an object, or a medium, from the observation of a response of the object (Ramm, 2005). Inverse problems arise widely in scientific and engineering areas (Isakov, 2006). For example, in geophysics the inverse problem is to find subsurface inhomogeneities by sending an acoustic wave; in nondestructive testing, it is to find a defect in the material from the eigenfrequencies. The nonlinear and ill-posed nature of inverse problems causes instability of the solutions, i.e., slight changes in the observed data may lead to significant deviations in the solutions. So far, a variety of mathematical and physical methods have been used to solve inverse problems (Ito and Kunish, 1985). Among these, Pulse Spectrum Technology (PST) and the regularization method are widely used. But the PST method strongly depends on the selection of the initial model; if the initial model is inappropriate, it easily falls into a local optimum. Similarly, the effect of the regularization method is decided by the choice of regularizer, and much of the literature discusses the choice of regularization parameters (Ben-Yu and Zou, 2001; Huang and Zou, 2007; Li and Zou, 2007; Liu and Zou, 2007; Jin and Zou, 2008). Wu *et al*. (2004) originally proposed an elite-subspace evolutionary algorithm to solve parameter identification inverse problems, and measurement noise was first taken into consideration there.

The particle swarm optimization algorithm, which simulates the foraging behavior of social insects, is a population-based intelligence strategy. It converges fast and is easy to implement. Although **particle swarm optimization** algorithms have been used successfully in many engineering optimization areas (Liang *et al*., 2006; Parsopoulos and Vrahatis, 2004; Kadirkamanathan *et al*., 2006), few of them solve inverse problems effectively. In this study, a hybrid PSO algorithm is proposed to identify the parameter of an elliptic differential equation.

**DESCRIPTION OF INVERSE PROBLEM**

The following elliptic problem will be concerned in this study:

−(q(x)u′(x))′ = f(x), x ∈ Ω, with u given on Γ (1)

Here, Ω can be any bounded domain in R with piecewise smooth boundary Γ. The objective of a parameter identification procedure is to choose a parameter q* such that the solution u associated with q* matches the measured state well. In general, the measured data u may be corrupted by measurement errors; the data with noise level δ are denoted by u^{δ}. In practice, we can only measure the state u at finitely many discrete points. The interval [0,1] is divided equally into n parts with step size h = 1/n and mesh points x_{i} = ih (i = 0,1,2,…,n). Suppose the two boundary values u_{0} and u_{n} are given; the observed values of u^{δ} at the interior mesh points x_{i} (i = 1,2,…,n−1) are denoted by u^{δ} = (u_{1},u_{2},…,u_{n−1}). With f(x) and u known, the inverse problem is to identify the parameter q.

Wu *et al*. (2004) proposed using the hat functions φ_{0}(x), φ_{1}(x),…, φ_{n}(x) to discretize the parameter q(x). The identified parameter q(x) can be expressed as:

q(x) = Σ_{i=0}^{n} q_{i}φ_{i}(x) (2)

Where:

φ_{i}(x) is the piecewise-linear hat function taking the value 1 at x_{i} and 0 at every other mesh point, i.e., φ_{i}(x) = (x − x_{i−1})/h for x_{i−1} ≤ x ≤ x_{i}, φ_{i}(x) = (x_{i+1} − x)/h for x_{i} ≤ x ≤ x_{i+1} and φ_{i}(x) = 0 otherwise.
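A minimal sketch of the hat-basis expansion in Eq. 2, assuming the standard piecewise-linear hat functions described above (the helper names `hat_basis` and `q_of_x` are ours, not the paper's):

```python
import numpy as np

def hat_basis(x, i, h):
    """Piecewise-linear hat function phi_i centered at mesh point x_i = i*h."""
    xi = i * h
    return np.maximum(0.0, 1.0 - np.abs(x - xi) / h)

def q_of_x(x, coeffs, h):
    """Evaluate q(x) = sum_i q_i * phi_i(x) as in Eq. 2."""
    return sum(qi * hat_basis(x, i, h) for i, qi in enumerate(coeffs))
```

Because φ_{i}(x_{j}) = δ_{ij}, evaluating at a mesh point recovers the nodal coefficient exactly, and evaluating between nodes interpolates linearly.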

Eq. 1 can be transformed into discrete form by using the difference method and the hat functions, as in Eq. 3; q(x) can be obtained by solving Eq. 3:

(3) |
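The exact form of Eq. 3 was lost in extraction. A sketch of one standard centered-difference discretization, assuming Eq. 1 has the form −(q(x)u′(x))′ = f(x) with Dirichlet boundary data and arithmetic averaging of q at the half nodes (the function name `solve_forward` is ours):

```python
import numpy as np

def solve_forward(q, f, u0, un, h):
    """Solve -(q(x) u'(x))' = f(x) by centered finite differences.

    q : nodal values q_0..q_n; f : values f(x_i) at interior nodes x_1..x_{n-1};
    u0, un : Dirichlet boundary values. Returns interior values u_1..u_{n-1}.
    """
    q = np.asarray(q, dtype=float)
    n = len(q) - 1
    qm = 0.5 * (q[:-1] + q[1:])          # q at half nodes: q_{i+1/2}
    A = np.zeros((n - 1, n - 1))
    b = h * h * np.asarray(f, dtype=float)
    for k in range(n - 1):
        i = k + 1                         # global interior node index
        A[k, k] = qm[i - 1] + qm[i]       # diagonal: q_{i-1/2} + q_{i+1/2}
        if k > 0:
            A[k, k - 1] = -qm[i - 1]
        if k < n - 2:
            A[k, k + 1] = -qm[i]
    b[0] += qm[0] * u0                    # fold boundary values into the rhs
    b[-1] += qm[n - 1] * un
    return np.linalg.solve(A, b)
```

With q ≡ 1 this reduces to the classic −u″ = f scheme, which is exact for quadratic u.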

To transform the ill-posed parameter identification problem into an optimization problem, we consider the regularization method to evaluate the quality of solution. Regularization function which decides the evolution direction is as follows:

J(q) = ‖u_{q} − u^{δ}‖² + βN(q) (4)

where, u_{q} is the solution by solving differential Eq. 1, β is a regularization parameter. N(q) is a regularization term which is determined by the continuality of the parameter q and the penalty function N(q) we designed is:

(5) |
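The designed penalty N(q) in Eq. 5 was lost in extraction; a sum of squared differences of adjacent nodal values is one common smoothness penalty and is assumed here. A sketch of the regularized objective of Eq. 4 under that assumption (`forward` stands for any solver of Eq. 1, e.g. a finite-difference routine):

```python
import numpy as np

def penalty(q):
    """Assumed N(q): squared differences of adjacent nodal values of q."""
    q = np.asarray(q, dtype=float)
    return float(np.sum(np.diff(q) ** 2))

def objective(q, u_obs, forward, beta):
    """J(q) = ||u_q - u_obs||^2 + beta * N(q), as in Eq. 4."""
    u_q = forward(q)                      # solve the forward problem for this q
    misfit = float(np.sum((np.asarray(u_q) - np.asarray(u_obs)) ** 2))
    return misfit + beta * penalty(q)
```

This J(q) is the fitness the swarm minimizes: the misfit term drives u_{q} toward the observations while βN(q) suppresses oscillatory, non-physical q.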

**THE HYBRID PARTICLE SWARM OPTIMIZATION**

PSO is a population based algorithm introduced by Kennedy and Eberhart (1995). In PSO, a particle represents a potential solution which is a point in the search space. Each particle adjusts its flying direction according to its own experience and the other particles’ experience of the swarm. Each particle in swarm has its own velocity and position in different time. The velocity and position updating formulas are in Eq. 6 and 7, respectively.

v_{t+1} = ωv_{t} + c_{1}r_{1}(pbest_{t} − x_{t}) + c_{2}r_{2}(gbest_{t} − x_{t}) (6)

x_{t+1} = x_{t} + v_{t+1} (7)

where v_{t} and x_{t} are the velocity and position of the ith particle at time t, respectively. pbest_{t} is the best previous position the ith particle has ever visited; gbest_{t} is the best position discovered by the whole swarm. c_{1} and c_{2} are the cognitive and social parameters, respectively; both are positive numbers. r_{1} and r_{2} are random numbers in the range [0,1]. The velocity v_{t} cannot exceed the constant v_{max}. The inertia weight ω, distributed in the range [0,1], is used to balance the global and local search abilities: a large inertia weight is more appropriate for global search and a small inertia weight facilitates local search. The velocity Eq. 6 has three elements: the current velocity of the particle, the cognitive part, which represents the particle's own experience, and the social part, which reflects information sharing in the swarm.
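The update of Eq. 6 and 7 can be sketched for a whole swarm at once (the function name `pso_step` is ours; `rng` is a fixed-seed generator for reproducibility):

```python
import numpy as np

rng = np.random.default_rng(0)

def pso_step(x, v, pbest, gbest, w, c1=2.0, c2=2.0):
    """One velocity/position update (Eq. 6 and 7) for a swarm of shape (m, d)."""
    r1 = rng.random(x.shape)              # fresh random numbers in [0, 1]
    r2 = rng.random(x.shape)
    v_new = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v_new, v_new               # Eq. 7 uses the updated velocity
```

When a particle sits exactly at both pbest and gbest, the cognitive and social terms vanish and only the inertia term ωv survives, which is the stagnation situation the crossover operator later addresses.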

**The effect of velocity limit:** From Eq. 6, we find that x_{t+1} is limited to the ball centered at x_{t} with radius v_{max}, i.e., x_{t+1} ∈ U(x_{t}, |v_{t+1}|) ⊆ U(x_{t}, v_{max}). The velocity limit v_{max} decreases the global search ability and increases the possibility of premature convergence, because the change of the position vector is restricted by v_{max}.

Equation 6 can be simplified as:

v_{t+1} = ωv_{t} + u − φx_{t} (8)

where, φ_{1} = c_{1}r_{1}, φ_{2} = c_{2}r_{2}, φ = φ_{1}+φ_{2} and u = φ_{1}pbest_{t}+φ_{2}gbest_{t}. Equation 8 corresponds to a second-order linear differential equation. The following theorem 1 is the existence and uniqueness theorem for such an equation.

**Theorem 1 (Robinson, 2004):** Given a function f(x_{2}, x_{1}, t), suppose that f, ∂f/∂x_{1} and ∂f/∂x_{2} are continuous functions for a_{1}<x_{1}<a_{2}, b_{1}<x_{2}<b_{2} and t_{1}<t<t_{2}. Then for all initial conditions:

x(t_{0}) = x_{0} (9)

dx/dt(t_{0}) = v_{0} (10)

the equation d²x/dt² = f(dx/dt, x, t) has a unique solution on some interval I containing t_{0}, i.e., a continuous function with two continuous derivatives that satisfies Eq. 9 and 10 on I.

The particle velocity is v_{t} = dx/dt. In the initialization procedure, each particle is assigned a random velocity and position in the solution space [x_{min}, x_{max}]. Suppose x(t_{0}) = x_{0}, v(t_{0}) = dx/dt = v_{0} and x_{0}, v_{0} ∈ [x_{min}, x_{max}]. From Eq. 8, we can get:

d²x/dt² = f(dx/dt, x, t) = (ω − 1)dx/dt − φx + u

Because ∂f/∂v and ∂f/∂x are continuous in [x_{min}, x_{max}], the theorem is deduced from the above preliminary discussion.

**Theorem 2:** For particle system, Eq. 6 has a unique solution without the velocity limit v_{max}.

According to theorem 2, we can remove the velocity limit v_{max}, so the particle can search the whole space Ω, i.e., x_{t+1} ∈ Ω.

**A new velocity updating equation:** In this section, a new velocity updating equation is proposed based on the analysis of the particle’s trajectory.

Equation 6 and 7 can be transformed into:

v_{t+1} = ωv_{t} + u − φx_{t} (11)

x_{t+1} = (1 − φ)x_{t} + ωv_{t} + u (12)

Let y_{t} = u/φ−x_{t}; Eq. 11 and 12 can be described in matrix form:

H_{t+1} = BH_{t} (13)

Where:

H_{t} = (v_{t}, y_{t})^{T} and B is the 2×2 iteration matrix with rows (ω, φ) and (−ω, 1−φ).

Suppose we know the particle's initial state H_{0}; Eq. 13 can be turned into Eq. 14 by iteration.

H_{t} = B^{t}H_{0} (14)

If matrix B is diagonalizable, there exists a matrix Y_{2x2} making the following equation true.

YBY^{−1} = Λ (15)

where,

Λ = diag(e_{1}, e_{2}) and e_{1}, e_{2} are the distinct eigenvalues of B.

Let S_{t} = YH_{t}; Eq. 14 and 15 can be changed to:

S_{t+1} = ΛS_{t} (16)

S_{t} = Λ^{t}S_{0} (17)
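A small numerical check of this analysis, assuming the substitution y_{t} = u/φ − x_{t} turns Eq. 11-12 into H_{t+1} = BH_{t} with B = [[ω, φ], [−ω, 1−φ]], and using example values ω = 0.7 and φ = 1.5 (our choice, not the paper's): iterating B directly agrees with the eigen-decomposition of Eq. 14-17, and the eigenvalue moduli govern the convergence rate.

```python
import numpy as np

w, phi = 0.7, 1.5                         # assumed example values of omega and phi
B = np.array([[w, phi],
              [-w, 1.0 - phi]])           # H_{t+1} = B @ H_t with H_t = (v_t, y_t)

evals, Y = np.linalg.eig(B)               # columns of Y are eigenvectors of B
H0 = np.array([0.3, 1.0])                 # some initial state (v_0, y_0)

# Iterate Eq. 13 directly for 20 steps ...
H = H0.copy()
for _ in range(20):
    H = B @ H

# ... and reproduce it through the eigen-decomposition, as in Eq. 14-17
H_eig = (Y @ np.diag(evals ** 20) @ np.linalg.inv(Y) @ H0).real
assert np.allclose(H, H_eig)              # both routes give the same state
```

For these values the eigenvalues are a complex pair with modulus √ω < 1, so the state contracts geometrically, consistent with Theorem 3: the smaller the eigenvalue moduli, the faster the convergence.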

**Definition 1:** Let A = {x_{k}} and B = {y_{k}} be sequences with limits ς and ξ, respectively. If

lim_{k→∞} |y_{k} − ξ|/|x_{k} − ς| = 0

then B is said to converge faster than A.

**Theorem 3:** The PSO algorithm converges faster as the eigenvalues ||e_{i}||→0.

**Proof:** By Eq. 17, S_{t} = Λ^{t}S_{0}, so the convergence of the state is governed by the eigenvalues e_{i}. For two swarms with eigenvalues e_{i} and e_{i}′, the second converges faster, in the sense of Definition 1, if and only if ||e_{i}′/e_{i}||<1. So, the PSO algorithm converges faster as the eigenvalues ||e_{i}||→0.

The nearer ω is to 0, the higher the convergence rate of the PSO algorithm. Based on the above discussion, a new velocity updating equation is proposed:

(18) |

At the beginning of the run, the large inertia weight makes the particle's current velocity have a major impact on the velocity update, while the global best position gbest_{t} has only a minor impact, so the algorithm has stronger global search ability. As the inertia weight decreases, the algorithm converges faster. At the end of the run, the small inertia weight makes the social part have a great effect on the swarm's search behavior.

**The scheme of multi-parent crossover:** In the late stage of the optimization procedure, the particles converge faster with the decreasing inertia weight. It is proved that the optimization procedure falls into stagnation when the particles congregate at the point p* = (φ_{1}pbest+φ_{2}gbest)/φ. A multi-parent crossover operator is adopted to break the stagnation.

The multi-parents crossover operator is described as follows:

X′ is the new particle generated from several parents. The multi-parent crossover operator plays an essential role in keeping the diversity of the population and lets the new particle inherit the merits of the other particles as well.
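The paper's exact operator formula was lost in extraction. A sketch of a common multi-parent arithmetic crossover, generating the child as a random affine combination of M parents; the coefficient range [−0.5, 1.5] and the function name `multi_parent_crossover` are assumptions, not taken from the source:

```python
import numpy as np

rng = np.random.default_rng(1)

def multi_parent_crossover(parents, low=-0.5, high=1.5):
    """Generate one child X' as a random affine combination of M parents.

    Coefficients a_i are drawn from [low, high] and normalized to sum to 1,
    so the child can land outside the parents' convex hull (keeps diversity).
    """
    parents = np.asarray(parents, dtype=float)    # shape (M, d)
    a = rng.uniform(low, high, size=len(parents))
    while abs(a.sum()) < 1e-9:                    # avoid degenerate normalization
        a = rng.uniform(low, high, size=len(parents))
    a /= a.sum()                                  # coefficients sum to 1
    return a @ parents
```

Because the coefficients sum to 1 but may be negative, the child interpolates or extrapolates the parents, which is what lets the operator escape the congregation point p*.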

**The structure of the hybrid PSO:** Each particle represents a potential solution q(x). The structure of the hybrid PSO is described as follows:
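The pseudocode itself did not survive extraction; the loop below is a sketch assembled from the paper's description, under assumptions: the standard update of Eq. 6-7 stands in for Eq. 18 (whose exact form is not recoverable here), no velocity clamp is applied, ω decreases linearly from 0.9 to 0.4, stagnation is detected as `stall_limit` iterations without gbest improvement, and a multi-parent crossover child then replaces the worst particle. All names (`hybrid_pso`, `stall_limit`, bounds `lo`/`hi`) are ours.

```python
import numpy as np

rng = np.random.default_rng(2)

def hybrid_pso(f, dim, n_particles=30, iters=5000, c1=2.0, c2=2.0,
               lo=-5.0, hi=5.0, stall_limit=50):
    """Sketch of the hybrid PSO: no velocity limit, linearly decreasing
    inertia weight, multi-parent crossover injected on stagnation."""
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = rng.uniform(lo, hi, (n_particles, dim))
    pbest, pval = x.copy(), np.array([f(p) for p in x])
    g = int(np.argmin(pval)); gbest, gval = pbest[g].copy(), pval[g]
    stall = 0
    for t in range(iters):
        w = 0.9 - 0.5 * t / iters                 # inertia weight: 0.9 -> 0.4
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v                                 # no v_max clamp (theorem 2)
        val = np.array([f(p) for p in x])
        better = val < pval                       # update personal bests
        pbest[better], pval[better] = x[better], val[better]
        g = int(np.argmin(pval))
        if pval[g] < gval:
            gbest, gval, stall = pbest[g].copy(), pval[g], 0
        else:
            stall += 1
        if stall >= stall_limit:                  # stagnation: inject a new particle
            parents = pbest[rng.choice(n_particles, 4, replace=False)]
            a = rng.uniform(-0.5, 1.5, 4); a /= a.sum()
            worst = int(np.argmax(pval))
            x[worst] = a @ parents                # multi-parent crossover child
            v[worst] = np.zeros(dim)
            stall = 0
    return gbest, gval
```

For the parameter identification problem, `f` would be the regularized objective J(q) of Eq. 4 evaluated through the forward solver, and `dim` the number of nodal values of q.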

**EXPERIMENT RESULTS**

Here, numerical experiments on two different elliptic parameter identification problems are reported. In test 1, the parameter q(x) to be identified is a pure polynomial function; in test 2, q(x) is a trigonometric function. In the experiments, the swarm size and the maximum iteration number are set to 30 and 5000, respectively, the cognitive parameter c_{1} and social parameter c_{2} are set to 2 and the inertia weight ω decreases linearly from 0.9 to 0.4. Because the crossover probability p_{c} varies with the noise level δ, the concrete values of p_{c} are shown in Table 1. The regularization parameter β is different for different problems.

The observed data u(x) has the following form:

u^{δ} = (1 + δ·rand())u(x)

where, rand() is a function which generates a stochastic number uniformly distributed in [-1,1] and δ is the noise level parameter.
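The observation model above can be sketched directly (the function name `observe` is ours; a fixed-seed generator is used for reproducibility):

```python
import numpy as np

rng = np.random.default_rng(3)

def observe(u, delta):
    """u_delta = (1 + delta * rand()) * u, with rand() uniform in [-1, 1]."""
    noise = rng.uniform(-1.0, 1.0, size=np.shape(u))
    return (1.0 + delta * noise) * np.asarray(u, dtype=float)
```

Each observed value is thus perturbed multiplicatively by at most δ relative error, matching the 1%, 5% and 8% noise levels used in the tests.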

**Test 1:**

where, u(x) = sin(πx), q(x) = 4x^{2}+x+2.

For this experiment, β is 1.0x10^{-7}. The numerical results are shown in Fig. 1-3: Fig. 1 shows the result with noise level 1%, Fig. 2 with noise level 5% and Fig. 3 with noise level 8% for test 1. When the noise level is 1%, the identified solution fits the exact solution well, as in Fig. 1. When the noise level is 5%, the identified solution has only a minimal deviation from the exact solution, as in Fig. 2. When the noise level is 8%, the identified solution still fits the exact solution well, as in Fig. 3.

**Test 2:**

where u(x) = sin(πx), q(x) = 6x cos(πx)+6.

Table 1: | The value of crossover probability p_{c} |

Fig. 1: | The comparison for test 1 with δ = 1% |

Fig. 2: | The comparison for test 1 with δ = 5% |

Fig. 3: | The comparison for test 1 with δ = 8% |

For this experiment, β is 1.0x10^{-6}. The numerical results are shown in Fig. 4-6: Fig. 4 shows the result with noise level 1%, Fig. 5 with noise level 5% and Fig. 6 with noise level 8% for test 2.

Fig. 4: | The comparison for test 2 with δ = 1% |

Fig. 5: | The comparison for test 2 with δ = 5% |

Fig. 6: | The comparison for test 2 with δ = 8% |

From the figures above, the identified solutions fit the curves of the function q(*x*) well. There is nearly no difference between the identified solution and the exact solution when the noise level of the observed data is 1%, as in Fig. 4. In Fig. 5, the identified solution has only a very small deviation from the exact solution when the noise level is 5%. Even when the noise level increases to 8%, the experimental results are still satisfactory, as in Fig. 6.

Previously, Wu *et al*. (2004) performed the same experiments on discrete parameter identification in elliptic systems by applying an evolutionary algorithm with a multi-parent crossover operator. In the experiments of Wu *et al*. (2004), the proposed algorithm yields good performance on elliptic parameter identification problems and is not very sensitive to noise. This study presents a hybrid PSO algorithm without velocity limit to solve the same problems. Particles without a speed limit can expand the search scope, which enhances the global search capacity of the algorithm. A new velocity updating equation is proposed to accelerate the convergence of the swarm. When the evolution is deadlocked, new promising particles are generated by the multi-parent crossover operator. The numerical experiment results show that the hybrid PSO is effective for identifying the continuous parameter in elliptic problems. Because of the ill-posed nature of inverse problems, noise in the observed data can lead to large deviations in the solutions. Fortunately, the algorithm proposed in this paper is not sensitive to noise even when the noise level is up to 8%. We are going to study the identification of non-continuous parameters of inverse problems in the near future.

- Ben-Yu, G. and J. Zou, 2001. An augmented Lagrangian method for parameter identifications in parabolic systems. J. Math. Anal. Appl., 263: 49-68.

- Huang, J.G. and J. Zou, 2007. Uniform a priori estimates for elliptic and static Maxwell interface problems. Discrete Continuous Dyn. Syst., 7: 145-170.

- Li, J.Z. and J. Zou, 2007. A multi-level model correction method for parameter identification. Inverse Problems, 23: 1759-1786.

- Liu, H. and J. Zou, 2007. Zeros of the Bessel and spherical Bessel functions and their applications for uniqueness in inverse acoustic obstacle scattering. IMA J. Applied Math., 72: 817-831.

- Jin, B. and J. Zou, 2008. Inversion of Robin coefficient by a spectral stochastic finite element approach. J. Comput. Phys., 227: 3283-3306.

- Wu, Z., Z. Tang, J. Zou, L. Kang and M. Li, 2004. An evolutionary algorithm for solving parameter identification problems in elliptic systems. Proceedings of the IEEE Conference on Evolutionary Computation, June 19-23, Portland, OR, USA, pp: 803-808.

- Liang, J.J., A.K. Qin, P.N. Suganthan and S. Baskar, 2006. Comprehensive learning particle swarm optimizer for global optimization of multimodal functions. IEEE Trans. Evol. Comput., 10: 281-295.

- Parsopoulos, K.E. and M.N. Vrahatis, 2004. On the computation of all global minimizers through particle swarm optimization. IEEE Trans. Evol. Comput., 8: 211-224.

- Kadirkamanathan, V., K. Selvarajah and P.J. Fleming, 2006. Stability analysis of the particle dynamics in particle swarm optimizer. IEEE Trans. Evol. Comput., 10: 245-255.

- Kennedy, J. and R. Eberhart, 1995. Particle swarm optimization. Proceedings of the IEEE International Conference on Neural Networks, 4: 1942-1948.