ABSTRACT
In micromanipulation, microscope-based visual servoing can achieve high performance. To avoid the complicated calibration of the camera's intrinsic parameters, we apply an improved Broyden's method to estimate the image Jacobian matrix online. The method employs a Chebyshev polynomial to construct a cost function that approximates the optimal value, yielding fast convergence of the online estimation. Using the estimated Jacobian matrix, a PD visual controller drives the image features to their desired values with satisfactory dynamic performance. Micro-assembly experiments on micro parts under microscopes confirm that the proposed method is effective and feasible.
DOI: 10.3923/itj.2008.497.503
URL: https://scialert.net/abstract/?doi=itj.2008.497.503
INTRODUCTION
Micromanipulation robots have been widely applied in micro-electromechanical system assembly. To meet the demands of high-precision micromanipulation tasks, robots assisted by microscope vision have become indispensable. Visual servoing control methods can be divided into three classes: position-based visual servoing, image-based visual servoing and hybrid (position/image) visual servoing. The methods mentioned above require precise calibration of the camera's intrinsic parameters. However, system calibration is a complicated and difficult problem, especially for micromanipulation based on microscope vision. We therefore employ an uncalibrated method that estimates the image Jacobian matrix online.
Kang et al. (2006), Hutchinson et al. (1996), Lu et al. (1996), Zhao et al. (2006), Malik and Choi (2007), Hosada and Asada (1994) and Shen et al. (2003) have reported research on online estimation of the image Jacobian matrix. Piepmeier et al. (1999, 2004) present a moving-target tracking task based on a quasi-Newton optimization method; this approach is adaptive but cannot guarantee the stability of the visual servoing. Malis (2004) proposes a method that keeps the visual servo controller parameters constant when the camera's intrinsic parameters change. Liu et al. (2006) apply a depth-independent interaction matrix in uncalibrated visual servoing, using the method of Slotine and Li (1987) for online estimation. Chen and Dawson (2003) present an adaptive tracking controller based on homography. Su and Yugeng (2000) and Su et al. (2004) present a method for tracking a moving 3D object based on uncalibrated global visual feedback.
Unfortunately, the current estimation methods suffer from problems such as estimation lag, singularity and slow or uncertain convergence. These problems become more serious in dynamic environments.
To deal with the problems discussed above, we apply a Broyden's method to estimate the image Jacobian matrix. The method employs a Chebyshev polynomial to construct a cost function that approximates the optimal value, improving the convergence of the estimation. The results show that, when calibration information is unavailable or highly uncertain, the Chebyshev polynomial algorithm achieves satisfactory results, bringing additional performance and flexibility to the control of complex robotic systems. To verify the effectiveness of the method, a PD visual controller built on the estimated Jacobian matrix is used to drive the image features to their desired values with satisfactory dynamic performance. Micro-assembly experiments on micro parts under microscopes confirm that the proposed method is effective and feasible.
SYSTEM CONSTRUCTION OF MICROMANIPULATION
The micromanipulation system consists of a micromanipulation stage, a microscope vision unit and micro-grippers. The system construction is shown in Fig. 1.
Fig. 1: The system construction of the three-hands cooperation micromanipulation stage
Micromanipulation hands: The left and right hands each consist of a 3D high-precision micro-motion stage driven by AC servo motors and a one-DOF pose-adjustment joint driven by a DC servo motor. The motion range of the 3D micro-motion stage is 50x50x50 mm and its positioning precision is 2.5 μm. The rotary range of the pose-adjustment joint is ±180° with a resolution of 0.01°. The third hand has three DOF driven by DC servo motors, with an operation range of 20x20x20 mm.
Microscope vision: Microscope vision is the main means by which the micromanipulation robot obtains environment information. The vision system consists of two perpendicular (vertically crossed) optical paths, which monitor the micro-assembly space stereoscopically and obtain the position and pose of the object and the end-effector, providing control and decision-making information for the robot.
End-effector: Two types of micro-grippers have been developed by us: one driven by vacuum and the other driven by a piezoelectric ceramic actuator. They can handle micro parts of different sizes, shapes and materials.
ONLINE ESTIMATION OF THE IMAGE JACOBIAN MATRIX
Image Jacobian: The image Jacobian defines the relationship between the velocity of a robot end-effector and the change of an image feature. Let q = [q1, q2, …, qm]T ∈ Rm represent the coordinates of the robot end-effector in the task space and let the n-dimensional vector f = [f1, f2, …, fn]T be the corresponding position in image-feature space. Then the image Jacobian matrix Jq is defined as:
$\dot{f} = J_q(q)\,\dot{q}$ | (1)
Where:
$J_q(q) = \begin{bmatrix} \dfrac{\partial f_1}{\partial q_1} & \cdots & \dfrac{\partial f_1}{\partial q_m} \\ \vdots & \ddots & \vdots \\ \dfrac{\partial f_n}{\partial q_1} & \cdots & \dfrac{\partial f_n}{\partial q_m} \end{bmatrix}$ | (2)
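As a minimal numerical illustration of the relation in Eq. 1, the following Python sketch maps a small end-effector step to the corresponding image-feature change (the Jacobian entries and step sizes are hypothetical values, not taken from the experiments):

```python
import numpy as np

# Hypothetical 2x2 image Jacobian mapping an end-effector step (dx, dy) in the
# task space (mm) to a feature change (du, dv) in the image plane (pixels).
J = np.array([[12.5,  0.8],
              [-0.6, 11.9]])

dq = np.array([0.05, -0.02])   # end-effector displacement in mm
df = J @ dq                    # predicted change of the image feature (Eq. 1)
print(df)                      # [0.609, -0.268] pixels
```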
Image Jacobian matrix estimation based on Broyden's method: The image Jacobian matrix can be calculated by calibrating the intrinsic and extrinsic parameters of the robotic system and the sensor system. However, it is practically impossible to obtain precise system parameters in a dynamic or uncertain environment. We therefore employ Broyden's method to estimate the image Jacobian matrix.
Broyden's method for solving a nonlinear equation F(x) = 0 is shown in Eq. 3:
$A_{k+1} = A_k + \dfrac{(y_k - A_k s_k)\, s_k^{T}}{s_k^{T} s_k}$ | (3)
Where:
$s_k = x_{k+1} - x_k, \qquad y_k = F(x_{k+1}) - F(x_k)$
Now, we apply Broyden's method to construct the estimation model of the image Jacobian matrix. According to Eq. 1, the feature error between two images is represented as:
$e_f = f_d - f_c$ | (4)
where fd is the feature of the desired image and fc is the feature of the current image. The Taylor series expansion of ef is:
![]() | (5) |
where Rn(x) is the Lagrange remainder. We define J*q(qn) as the nth image Jacobian to be estimated:
![]() | (6) |
Ignoring the higher-order terms and the Lagrange remainder Rn(x), Eq. 7 can be obtained from Eq. 5 and 6:
![]() | (7) |
So, identifying A with the image Jacobian J, y with Δef and s with Δq, we can construct the Broyden-based image Jacobian estimation model, shown as Eq. 8:
$\hat{J}_{k+1} = \hat{J}_k + \dfrac{(\Delta e_f - \hat{J}_k \Delta q)\,\Delta q^{T}}{\Delta q^{T} \Delta q}$ | (8)
The Broyden algorithm estimates the optimal value by iterative computation. It therefore needs a termination condition for the iteration; we employ a Chebyshev polynomial to construct the cost function that approximates the optimal value.
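The rank-one Broyden correction of Eq. 8 can be sketched as follows (a minimal Python sketch assuming the standard secant-style update; the function and variable names are ours, not from the paper):

```python
import numpy as np

def broyden_update(J_est, dq, df, eps=1e-9):
    """Rank-one Broyden correction of the estimated image Jacobian (cf. Eq. 8).

    J_est : (n, m) current Jacobian estimate
    dq    : (m,)  end-effector displacement since the last update
    df    : (n,)  observed change of the image feature (or feature error)
    """
    denom = float(dq @ dq)
    if denom < eps:                   # skip the update for a negligible motion
        return J_est
    residual = df - J_est @ dq        # discrepancy between observed and predicted change
    return J_est + np.outer(residual, dq) / denom
```

The estimate is only corrected along the direction actually moved, which is why repeated small motions in varied directions are needed for the estimate to track the true Jacobian.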
The cost function based on Chebyshev polynomial: Provided that
![]() | (9) |
If Nk(q) ∈ C[-1, 1], then for the Chebyshev polynomial series {Tn, n = 0, 1, ...} with weight ρ(x) = (1-x²)^(-1/2), its optimal least-squares approximation polynomial can be written as:
![]() | (10) |
Where:
![]() | (11) |
Then
![]() | (12) |
In general Nk(q) ∈ C[a, b], so we must convert C[a, b] into C[-1, 1]. Eq. 13 performs this conversion.
![]() | (13) |
If we use the partial sum s*n as the approximation of N(q), then under suitable conditions the coefficients an → 0 rapidly. In theory, compared with the RLS algorithm, the Chebyshev polynomial approximation algorithm is independent of prior knowledge of the system and approximates faster than the other methods. The experiments confirm this.
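As a rough sketch of this approximation machinery (assuming the classical expansion s*n = a0/2 + Σ ak Tk with coefficients taken under the weight (1-x²)^(-1/2), and the usual affine map of [a, b] onto [-1, 1]; the actual cost function N(q) is application-specific, so a generic smooth function stands in for it here):

```python
import numpy as np

def cheb_coeffs(f, a, b, n_terms, n_nodes=64):
    """Chebyshev expansion coefficients of f on [a, b] via Gauss-Chebyshev quadrature."""
    k = np.arange(n_nodes)
    x = np.cos((2 * k + 1) * np.pi / (2 * n_nodes))      # Chebyshev nodes on [-1, 1]
    q = 0.5 * (b - a) * x + 0.5 * (a + b)                # nodes mapped back to [a, b] (cf. Eq. 13)
    fq = f(q)
    j = np.arange(n_terms)
    # a_j ~= (2 / n_nodes) * sum_k f(q_k) T_j(x_k); the node weights absorb (1 - x^2)^(-1/2)
    return (2.0 / n_nodes) * (np.cos(np.outer(j, np.arccos(x))) @ fq)

def cheb_eval(coeffs, a, b, q):
    """Evaluate the truncated series s_n*(q) = a_0 / 2 + sum_{j>=1} a_j T_j(x(q))."""
    x = (2.0 * np.asarray(q, dtype=float) - a - b) / (b - a)   # affine map of [a, b] onto [-1, 1]
    theta = np.arccos(np.clip(np.atleast_1d(x), -1.0, 1.0))
    j = np.arange(1, len(coeffs))
    terms = coeffs[1:] * np.cos(np.outer(theta, j))            # T_j(x) = cos(j * arccos(x))
    return 0.5 * coeffs[0] + terms.sum(axis=-1)

# The coefficients of a smooth function decay quickly, so a short partial sum suffices
# as the approximation / stopping criterion; a generic smooth stand-in is used here.
c = cheb_coeffs(lambda q: np.exp(-q) * np.sin(3.0 * q), 0.0, 2.0, n_terms=10)
print(np.abs(c))   # magnitudes fall off rapidly with the term index
```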
Admittedly, the drawback of the Chebyshev polynomial approximation algorithm that we encountered is that it requires N(q) to be sufficiently smooth, a requirement that is difficult to meet in many situations.
Comparison of Chebyshev polynomial approximation with RLS: Piepmeier et al. (1999, 2004) use an RLS algorithm to approximate the value minimizing the cost function. The RLS cost function is shown as Eq. 14.
![]() | (14) |
where λ weights the dependence on prior data. As shown in Eq. 14, the RLS cost function depends on the data of several past steps, which means that prior knowledge must be available to accomplish the task.
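A minimal sketch of an exponentially weighted cost of this kind (we assume the common forgetting-factor form; the exact expression of Eq. 14 is as given by Piepmeier et al.):

```python
import numpy as np

def rls_cost(errors, lam=0.9):
    """Exponentially weighted sum of past squared feature errors (forgetting factor lam < 1)."""
    e = np.atleast_2d(np.asarray(errors, dtype=float))   # row i is the feature-error vector at step i
    k = e.shape[0] - 1
    weights = lam ** (k - np.arange(k + 1))              # the most recent sample gets weight 1
    return float(np.sum(weights * np.sum(e * e, axis=1)))

# Example: older errors contribute less and less, so the estimate relies on recent data.
history = [[4.0, 3.0], [2.5, 1.5], [1.0, 0.5]]
print(rls_cost(history, lam=0.8))
```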
Similarly, the cost function using the Chebyshev polynomial is shown as Eq. 15.
![]() | (15) |
Fig. 2: A Broyden estimator of the image Jacobian with Chebyshev polynomial approximation
Clearly, the cost function using the Chebyshev polynomial is independent of the prior data.
Jacobian estimator with the improved Broyden's method: As discussed earlier, an image Jacobian estimator combining Broyden's method with the Chebyshev polynomial approximation algorithm is developed. A graphical representation of the estimation process is shown in Fig. 2. First, the Broyden estimator starts with the initial end-effector position q0 and the precision ε. The camera then captures an image of the end-effector, from which the corresponding image feature fk is extracted, allowing J*(qk) to be calculated from f′(qk). Second, the camera captures an image of the target to obtain the desired image feature fk+1. With the obtained J*(qk), the servo control law is derived in detail in the vision controller design section. Finally, the program checks whether the precision ε meets the system requirement; if it does, the procedure terminates, otherwise the process is repeated.
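The loop of Fig. 2 can be outlined as follows (a schematic sketch under our own naming; `capture_feature` and `move_end_effector` are hypothetical stand-ins for the vision and motion interfaces, and the proportional step is a simplification of the PD law described in the next section):

```python
import numpy as np

def servo_loop(q0, J0, f_target, capture_feature, move_end_effector,
               gain=0.2, eps=1.0, max_iter=200):
    """Uncalibrated servo loop with a Broyden-updated image Jacobian (outline of Fig. 2)."""
    q = np.asarray(q0, dtype=float)
    J = np.asarray(J0, dtype=float)
    f = np.asarray(capture_feature(), dtype=float)           # current image feature of the end-effector
    for _ in range(max_iter):
        e = f_target - f                                      # image-space error
        if np.linalg.norm(e) < eps:                           # termination test on the required precision
            break
        dq = gain * (np.linalg.pinv(J) @ e)                   # proportional step through the estimate
        move_end_effector(dq)
        f_new = np.asarray(capture_feature(), dtype=float)
        denom = float(dq @ dq)
        if denom > 1e-12:                                     # rank-one Broyden correction (cf. Eq. 8)
            J = J + np.outer((f_new - f) - J @ dq, dq) / denom
        q, f = q + dq, f_new
    return q, J
```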
THE VISION CONTROLLER DESIGN
To accomplish the three-dimensional positioning of small objects, the micromanipulation task is, in actual operation, divided into horizontal (XY plane) movement and vertical (Z axis) movement. The manipulator first moves in the XY plane to position itself above the small part, and then moves vertically along the Z axis to position the part at the centre. Accordingly, we apply two image Jacobian matrices, one for the horizontal view field and one for the vertical view field, which together allow the positioning and tracking of three-dimensional objects.
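A possible realization of this two-stage strategy is sketched below (assuming, as is typical for such a configuration, that the vertical view observes the XY plane and the horizontal view observes motion along Z; the threshold and gain are hypothetical):

```python
import numpy as np

def plan_step(e_xy, e_z, J_xy, j_z, gain=0.2, tol_px=2.0):
    """Return the next (dx, dy, dz) command for the two-stage positioning strategy.

    e_xy : (2,) feature error in the vertical-view image (pixels)
    e_z  : scalar feature error in the horizontal-view image (pixels)
    J_xy : (2, 2) image Jacobian of the vertical view field
    j_z  : scalar image Jacobian of the horizontal view field along Z
    """
    if np.linalg.norm(e_xy) > tol_px:                      # stage 1: align in the XY plane
        dxy = gain * (np.linalg.pinv(J_xy) @ e_xy)
        return np.array([dxy[0], dxy[1], 0.0])
    return np.array([0.0, 0.0, gain * e_z / j_z])          # stage 2: descend along Z
```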
Fig. 3: Micromanipulator servo control structure
The relationship between the change of robot movement [dx, dy]T and the change of image features [du, dv]T can be written as Eq. 16:
$\begin{bmatrix} du \\ dv \end{bmatrix} = \begin{bmatrix} j_{11} & j_{12} \\ j_{21} & j_{22} \end{bmatrix} \begin{bmatrix} dx \\ dy \end{bmatrix}$ | (16)
Using the image Jacobian matrix J estimated online by Broyden's method, we define the position error e = fd - fc, where fd is the desired position of the object (a small cylindrical part, 600 μm in diameter) and fc is the centre of the end-effector. Then, the control law of the PD controller u(k) is:
![]() | (17) |
where Ts is the sampling interval, Kp is the proportional gain and Kd is the derivative gain. The control structure is shown in Fig. 3.
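A minimal sketch of a discrete PD law of this kind (we assume the standard finite-difference derivative term and a pseudo-inverse of the estimated Jacobian to map the image-space action into a stage command; the exact arrangement of Eq. 17 is as given in the paper, and the sampling interval Ts below is an assumed value):

```python
import numpy as np

class PDVisualController:
    """Discrete PD controller acting on the image-feature error."""

    def __init__(self, kp=10.0, kd=1.5, ts=0.04):
        self.kp, self.kd, self.ts = kp, kd, ts     # gains from the experiments; Ts is an assumed value
        self.e_prev = None

    def command(self, e, J_est):
        """Return an end-effector motion command from the feature error e = f_d - f_c."""
        e = np.asarray(e, dtype=float)
        de = np.zeros_like(e) if self.e_prev is None else (e - self.e_prev) / self.ts
        self.e_prev = e
        u_img = self.kp * e + self.kd * de         # PD action computed in image space
        return np.linalg.pinv(J_est) @ u_img       # mapped to a stage motion via the estimated Jacobian
```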
RESULTS
A robotic micro-assembly system has been developed in our lab. Figure 4 shows the three-hands coordinated micromanipulation system. The system consists of two 4 degree-of-freedom (DOF) master micromanipulators and one 3-DOF slave micromanipulator. Each master micromanipulator provides three translational DOF and one rotational DOF, with a travel of 50 mm and a repeatability of 2 μm along each of the X, Y and Z axes. Stereo microscopic visual feedback is necessary for precise 3D micro-assembly. Compared with the usual stereo vision configuration using parallel views, a microscopic vision unit with two perpendicular views is developed to reduce the structural complexity of the mechanism. The vertical and horizontal microscopic views together provide a 3D micro-assembly scene and avoid the complicated depth estimation required in many micromanipulation tasks.
To accomplish micromanipulator positioning and gripping of small parts, we must first obtain the centre of the object and the centre of the end of the end-effector.
Fig. 4: The experimental system of micromanipulation
Fig. 5: The original microscopic image of the object and the end-effector in the vertical (a) and horizontal (b) view fields
The centre of the object and the end of the end-effector are obtained by a series of image processing steps (greyscale conversion, de-noising and filtering, Canny edge extraction and fuzzy c-means clustering). Figure 5 shows the original microscopic images of the two view fields, with the object (cylindrical part) and the end-effector (clip-shaped object). Figure 6 shows the object centre and the end centre of the end-effector after processing. In Fig. 6, the XY image-plane coordinates of the centre of the object are (147, 99) and those of the centre of the end of the end-effector are (343, 77) (image size 440x330 pixels).
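A rough OpenCV sketch of such a centre-extraction pipeline (greyscale conversion, smoothing, Canny edges, then a centroid; the fuzzy c-means clustering step of the actual pipeline is replaced here by a simple largest-contour centroid, and all thresholds are assumed):

```python
import cv2

def find_centre(image_bgr):
    """Return the (u, v) pixel centre of the dominant object in a microscope image."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    smooth = cv2.GaussianBlur(gray, (5, 5), 0)               # de-noising / filtering
    edges = cv2.Canny(smooth, 50, 150)                       # Canny edge extraction
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)             # stand-in for the clustering step
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])        # centroid in image coordinates
```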
With the initial PD controller parameters set to Kp = 10 and Kd = 0, i.e., proportional control only, the control effect is shown in Fig. 7 (note: the proportional gain is limited to 10 because of the very small micro movements and the real-time requirement; too large a proportional gain may cause system oscillation). The object is automatically positioned at the target centre, but with considerable oscillation and overshoot.
Fig. 6: The object centre and the end centre of the end-effector after processing in the vertical (a) and horizontal (b) view fields
Fig. 7: The trajectories of the micromanipulator approaching the goal object with proportional control only (XY plane)
Fig. 8: The trajectories of the micromanipulator approaching the goal object with proportional and derivative control (XY plane)
Fig. 9: The image of the end-effector automatically locating and gripping the object in the vertical (a) and horizontal (b) view fields
Fig. 10: Convergence speed of the Chebyshev algorithm (top) and of RLS (bottom)
When Kp = 10 and Kd = 1.5, i.e., with proportional and derivative control combined, the control result is shown in Fig. 8. The added derivative action clearly suppresses the system overshoot, and the response becomes rapid and smooth. Finally, the micromanipulator positioning and automatic gripping operations are carried out, with results that satisfy the system application requirements. Figure 9 shows the end-effector automatically locating and gripping the object.
Finally, we compare the convergence speed of the cost function based on Chebyshev polynomials with that of RLS, as shown in Fig. 10. Clearly, compared with the RLS algorithm, the cost function based on Chebyshev polynomials achieves further convergence within about two time steps, improving the convergence of the system identification process.
CONCLUSION
To complete the assembly of three-dimensional micro-sized components, we apply an improved Broyden's method to estimate the image Jacobian matrix online, employing a Chebyshev polynomial to construct a cost function that approximates the optimal value. Using the identified image Jacobian matrix, a PD controller is designed to control the micro-robot. Finally, in the microscope vision environment, the visual servoing task of micromanipulator positioning and automatic gripping of small parts is completed. The experimental results show that the algorithm is effective and that the PD controller achieves the required rapid and smooth response. At the present stage, the proportional and derivative gains are set by experience and the system's performance is not yet ideal; future work will address adaptive adjustment of the PD parameters.
REFERENCES
- Chen, J. and D.M. Dawson, 2003. Adaptive homography-based visual servo tracking. IEEE Int. Conf. Intel. Robot. Syst., 1: 230-235.
- Hosada, K. and M. Asada, 1994. Versatile visual servoing without knowledge of true Jacobian. Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, September 12-16, 1994, Munich, Germany, pp: 186-191.
- Hutchinson, S., G.D. Hager and P.I. Corke, 1996. A tutorial on visual servo control. IEEE Trans. Robot. Automat., 12: 651-670.
- Kang, Q.S., T. Hao, Z.D. Meng and X.Z. Dai, 2006. Pseudo inverse estimation of image jacobian matrix in uncalibration visual servoing. Proceedings of the International Conference on Mechatronics and Automation, June 25-28, 2006, Luoyang, Henan, pp: 1515-1520.
- Liu, Y.H., H.S. Wang, C.Y. Wang and K.L. Kin, 2006. Uncalibrated visual servoing of robots using a depth-independent interaction matrix. IEEE Trans. Robot., 22: 804-817.
- Lu, C.P., E. Mjolsness and G.D. Hager, 1996. Online computation of exterior orientation with application to hand-eye calibration. Math. Comput. Model., 24: 121-143.
- Malik, A.S. and T.S. Choi, 2007. Consideration of illumination effects and optimization of window size for accurate calculation of depth map for 3D shape recovery. Pattern Recognit., 40: 154-170.
- Malis, E., 2004. Visual servoing invariant to changes in camera-intrinsic parameters. IEEE Trans. Robot. Autom., 20: 72-81.
- Piepmeier, J.A., G.V. MacMurray and H. Lipkin, 1999. A dynamic quasi-Newton method for uncalibrated visual servoing. Proceedings of the International Conference on Robotics and Automation, May 10-15, 1999, Detroit, Michigan, pp: 1595-1600.
- Piepmeier, J.A., G.V. MacMurray and H. Lipkin, 2004. Uncalibrated dynamic visual servoing. IEEE Trans. Robot. Autom., 20: 143-147.
- Shen, S.D., Y.H. Liu and K. Li, 2003. Asymptotic trajectory tracking of manipulators using uncalibrated visual feedback. IEEE/ASME Trans. Mechatron., 8: 87-98.
- Slotine, J.J. and W. Li, 1987. On the adaptive control of robot manipulators. Int. J. Robot. Res., 6: 49-59.
- Su, J., H. Ma, W. Qiu and Y. Xi, 2004. Task-independent robotic uncalibrated hand-eye coordination based on the extended state observer. IEEE Trans. Syst. Man Cybernet., 34: 1917-1922.