Research Article
 

Study of Key Pose of Movement Similarity on Humanoid Robot



Ke Wen-De, H. Bing-rong, C. Gang, P. Zhi-ping, Y. Quan-de and H. Liang
 
ABSTRACT

The method of similarity movement has attracted increasing attention because it outperforms traditional mathematical analysis in energy optimization and joint-track smoothness when complicated joint tracks are designed for a humanoid robot. This study examines the extended architecture for obtaining and applying similar tracks, the similarity degree and synchronous control, and multi-level key poses. Firstly, the extended architecture for obtaining and applying similar tracks was proposed, comprising nine modules: movement capturing, feature extraction, applying similar tracks, kinematics constraint, dynamics constraint, simulating experiment, practical experiment, standard database and correction for environments. Secondly, the similarity degree was defined in terms of the differences in angles, angular velocities and angular accelerations between the human being and the humanoid robot, and movement synchronization was controlled through the angular velocity ratio and its rate of change and the angular acceleration ratio and its rate of change. Thirdly, the converting relationships of key poses were presented. For multi-level key poses, the rules of priority level from the highest level to the lowest one and the corresponding bottom-up dendrogram were constructed, and the method for calculating the period of time of the γth level was given. In the experiment, the humanoid robot Aldebaran Nao was used to imitate the dance motion of a human being. Each dancing pose of the human being was divided into four levels of key poses: (Finger, Wrist, Elbow, Shoulder) for the upper limbs and (Toe, Ankle, Knee, Hip) for the lower limbs. The multi-level key poses acted simultaneously to realize the synchronization of the similar movement for the humanoid robot. The comparison of corresponding angles, such as the shoulders, elbows, hips, knees and ankles, between the human being and the humanoid robot showed the effect of the similarity movement.


 
  How to cite this article:

Ke Wen-De, H. Bing-rong, C. Gang, P. Zhi-ping, Y. Quan-de and H. Liang, 2012. Study of Key Pose of Movement Similarity on Humanoid Robot. Information Technology Journal, 11: 1612-1618.

DOI: 10.3923/itj.2012.1612.1618

URL: https://scialert.net/abstract/?doi=itj.2012.1612.1618
 
Received: January 26, 2012; Accepted: June 11, 2012; Published: August 28, 2012



INTRODUCTION

Compared with the traditional methods of designing the joint tracks of a humanoid robot through mathematical analytic equations (Chen, 2009; Bo et al., 2010; Soltanpour et al., 2008; Naeimi et al., 2009; Arbaoui et al., 2006; Motlagh, 2011; Zhixiang et al., 2011), the newer method based on the similarity movement of human beings offers advantages in energy optimization, smooth joint tracks, the design of complicated movements, etc. (Moldenhauer et al., 2005). For instance, Huang et al. (2010) set kinematics and dynamics constraints for the acquired movement tracks and designed a sword performance for a humanoid robot. Ruchanurucks et al. (2011) categorized human movements into corresponding symbols, from which complicated movements were created through different combinations of symbols. Hapsari and Prabuwono (2010) reviewed human motion analysis, the methodologies of human motion surveillance, the choice of system and how they work. Xiao-Jun et al. (2005) applied a system for acquiring the similarity movement parameters of human beings to match a Taiji performance for a humanoid robot. Takano et al. (2007) extracted human beings' basic movements and constructed a symbol database. Kim et al. (2006) realized similar movement of the upper limbs of a humanoid robot through a hybrid design method. Liu et al. (2011a, b) and Wang et al. (2009) studied motion-imitation interaction through human motion data acquisition and modification and ankle angle adjustment. Ramos et al. (2011) proposed a methodology to quickly reshape a dynamic motion demonstrated by a human and to map the dynamics of the human onto the dynamics of the robot. Albrecht et al. (2011) proposed an end-to-end framework that equips robots with the capability to perform reaching motions in a natural, human-like fashion. Boutin et al. (2010) proposed a method for generating trajectories for humanoid robots from the imitation of human gaits captured with a motion capture system. Do et al. (2009) proposed a system for vision-based grasp recognition, mapping and execution on a humanoid robot to provide an intuitive and natural communication channel between humans and humanoids.

The above studies mainly focused on how to capture similarity tracks and how to set kinematics and dynamics controls for the humanoid robot. The present study, in contrast, focuses on the similarity degree and synchronous control.

THE EXTENDED ARCHITECTURE OF OBTAINING AND APPLYING SIMILAR TRACKS

The extended architecture for obtaining and applying similar tracks is shown in Fig. 1. Its nine modules work as follows:

(1) The movement capturing module acquires the multi-joint moving tracks of the human being and saves them to the image database
(2) The feature extraction module eliminates image distortion and noise, such as salt-and-pepper noise, which can be removed by median filtering
(3) The module for applying similar tracks maps the extracted tracks, together with their time measure, onto the joints of the humanoid robot
(4) The kinematics module sets the movement constraints so that the humanoid robot meets the kinematic requirements
(5) The dynamics module computes the motion energy consumption of the humanoid robot for further optimization
(6) The simulating module shows the simulated effect of the similarity movement
(7) The simulated result is verified through practical experiments and fed back to revise the angle, angular velocity, angular acceleration, synchronization and stability of the humanoid robot offline
(8) The optimized movements are stored in the standard database so that the humanoid robot can carry out the actions independently without the similarity environment
(9) The correction module for different environments adjusts the standard movements to meet the requirements of different environments

Fig. 1: The extended architecture of obtaining and applying similar tracks
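The sketch below is purely illustrative of how the nine modules could be chained as a processing pipeline; it is not the authors' implementation, and the function names and bodies are stand-ins (a temporal median filter for module 2, joint-angle clipping for module 4 and a crude energy proxy for module 5), written here in Python under those assumptions.

import numpy as np

def feature_extraction(frames):
    # (2) suppress salt-and-pepper noise with a 3-sample median filter along time
    return np.median(np.stack([frames[:-2], frames[1:-1], frames[2:]]), axis=0)

def kinematic_constraint(tracks, limits):
    # (4) clamp every joint angle into the robot's mechanical range
    return np.clip(tracks, limits[:, 0], limits[:, 1])

def dynamic_constraint(tracks, dt):
    # (5) crude energy proxy: summed squared angle increments per unit time
    return np.sum(np.diff(tracks, axis=0) ** 2) / dt

# (1), (3): captured human tracks assumed already mapped onto N robot joints
tracks = np.random.rand(100, 4)                     # 100 samples, 4 joints (stand-in data)
limits = np.tile(np.array([[-2.0, 2.0]]), (4, 1))   # per-joint angle limits in radians
tracks = feature_extraction(tracks)
tracks = kinematic_constraint(tracks, limits)
print("energy proxy:", dynamic_constraint(tracks, dt=0.1))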

SIMILARITY DEGREE AND SYNCHRONOUS CONTROL

Definition of similarity degree: P^W = {Φ^W(0), ..., Φ^W(t), ..., Φ^W(T)} represents all key poses extracted from the movement process of W in the time interval (0, T), where W = {H, R}, H denotes the human being and R the humanoid robot. Φ^W(t) = {Φ_1^W(t), ..., Φ_i^W(t), ..., Φ_N^W(t)}^T represents the angles of all joints of W at time t, where t ∈ (0, T) and N is the number of joints. The similarity degree is defined as:

[Eq. 1: equation image not reproduced in this version. It defines the similarity degree of the ith joint as a weighted combination, with coefficients τ1, τ2 and τ3, of the differences in joint angle, angular velocity and angular acceleration between the human being and the humanoid robot, with the joint's angle range Φ_i^max - Φ_i^min entering as a normalizing factor]
(1)

where Φ_i^max and Φ_i^min represent the maximum and minimum values of the angle range of the ith joint, respectively. Formula 1 gives the similarity degree of the ith joint between the humanoid robot and the human being; τ1, τ2 and τ3 are coefficients that satisfy 0 ≤ τ1, τ2, τ3 ≤ 1 and τ1 + τ2 + τ3 = 1. The movement tends to be motionless when τ tends to 1 and vice versa.
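Because the image of Eq. 1 is not reproduced above, its exact form cannot be restated here. The following Python sketch only illustrates its stated intent, a per-joint similarity degree built from the weighted differences in angle, angular velocity and angular acceleration between human (H) and robot (R); the normalization of the velocity and acceleration terms and the default weights are assumptions, not the authors' formula.

import numpy as np

def similarity_degree(phi_h, phi_r, dt, phi_max, phi_min, tau=(1/3, 1/3, 1/3)):
    """phi_h, phi_r: sampled angle tracks of one joint; tau = (tau1, tau2, tau3), sum 1."""
    t1, t2, t3 = tau
    rng = phi_max - phi_min                                  # joint angle range
    d_ang = np.abs(phi_h - phi_r) / rng                      # normalized angle difference
    d_vel = np.abs(np.gradient(phi_h, dt) - np.gradient(phi_r, dt)) / (rng / dt)
    d_acc = np.abs(np.gradient(np.gradient(phi_h, dt), dt) -
                   np.gradient(np.gradient(phi_r, dt), dt)) / (rng / dt ** 2)
    return 1.0 - np.mean(t1 * d_ang + t2 * d_vel + t3 * d_acc)

t = np.linspace(0.0, 1.0, 101)
human = np.sin(2 * np.pi * t)
robot = np.sin(2 * np.pi * t - 0.05)        # slightly lagging imitation of the same motion
print(similarity_degree(human, robot, dt=0.01, phi_max=1.0, phi_min=-1.0))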

Description of movement synchronization: The synchronization describes the movement consistency of the related joints of the humanoid robot from the initial states to the terminal ones. In order to guarantee movement synchronization between the ith joint and the jth one in the time interval (t1, t2), the relationship between their angular velocities, $\dot{\Phi}_i^R(t)$ and $\dot{\Phi}_j^R(t)$, shall be analyzed. The angular velocity ratio and its rate of change at time t are:

$$k_1(t) = \frac{\dot{\Phi}_i^R(t)}{\dot{\Phi}_j^R(t)}, \qquad k_1'(t) = \frac{\mathrm{d}k_1(t)}{\mathrm{d}t}$$
(2)

where: (1) k1>1 means that the angular velocity of the ith joint is larger than that of the jth one and k΄1>0 means that the growth rate of the ith joint's angular velocity is larger than that of the jth one; (2) k1 = 1 means that the angular velocity of the ith joint equals that of the jth one and k΄1 = 0 means that the two angular velocities grow at the same rate; (3) k1<1 means that the angular velocity of the ith joint is smaller than that of the jth one and k΄1<0 means that the growth rate of the ith joint's angular velocity is smaller than that of the jth one.

Similarly, the angular acceleration ratio and its rate of change at time t are:

$$k_2(t) = \frac{\ddot{\Phi}_i^R(t)}{\ddot{\Phi}_j^R(t)}, \qquad k_2'(t) = \frac{\mathrm{d}k_2(t)}{\mathrm{d}t}$$
(3)

where: (1) k2>1 means that the angular acceleration of the ith joint is larger than that of the jth one and k΄2>0 means that the growth rate of the ith joint's angular acceleration is larger than that of the jth one; (2) k2 = 1 means that the angular acceleration of the ith joint equals that of the jth one and k΄2 = 0 means that the two angular accelerations grow at the same rate; (3) k2<1 means that the angular acceleration of the ith joint is smaller than that of the jth one and k΄2<0 means that the growth rate of the ith joint's angular acceleration is smaller than that of the jth one. In particular, when joints i and j are identical, (1) k2>1 means F_i^R(t)>F_j^R(t) for the driving forces and k΄2>0 means that the driving force of the ith joint grows faster than that of the jth one; (2) k2 = 1 means F_i^R(t) = F_j^R(t) and k΄2 = 0 means that the two driving forces grow at the same rate; (3) k2<1 means F_i^R(t)<F_j^R(t) and k΄2<0 means that the driving force of the ith joint grows more slowly than that of the jth one. From the above analysis, movement synchronization can be guaranteed by controlling the angular acceleration ratio or the driving force ratio between joints.
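As a minimal sketch of how these quantities could be monitored in practice (the finite-difference discretization and the toy trajectories are assumptions, not part of the paper), the ratios k1 and k2 and their rates of change can be estimated from two sampled joint tracks:

import numpy as np

def sync_ratios(phi_i, phi_j, dt):
    vi, vj = np.gradient(phi_i, dt), np.gradient(phi_j, dt)   # angular velocities
    ai, aj = np.gradient(vi, dt), np.gradient(vj, dt)         # angular accelerations
    k1 = vi / vj                        # angular-velocity ratio of joint i to joint j
    k2 = ai / aj                        # angular-acceleration ratio
    return k1, np.gradient(k1, dt), k2, np.gradient(k2, dt)

dt = 0.1                                # 100 msec sampling period, as in the experiment
t = np.arange(dt, 2.0, dt)
phi_i = 0.50 * t ** 2 + 0.10 * t        # joint i accelerates twice as hard as joint j
phi_j = 0.25 * t ** 2 + 0.05 * t
k1, dk1, k2, dk2 = sync_ratios(phi_i, phi_j, dt)
print(k1[5], dk1[5], k2[5], dk2[5])     # roughly 2, 0, 2, 0: joint i leads, ratios constant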

MULTI-LEVEL KEY POSES

The pose of the human body is regarded as a key pose when it pauses at a given moment (Albrecht et al., 2011). The states of key poses (pausing, constant velocity, constant acceleration, changing velocity, etc.) can be converted into one another according to Table 1.

Table 1: Transformation of key postures

Fig. 2: Dendrogram of key poses

If the angular velocities of all joints of the humanoid robot satisfy $\dot{\Phi}_i^R(t) = 0$, i ∈ {1, ..., N}, the current pose is a Level One key pose.

The multi-level key poses are set through the following rules of priority level from the highest level to the lowest one:

Rule 1: Finger→wrist→elbow→shoulder
Rule 2: Toe→ankle→knee→hip
Rule 3: Neck→waist

The dendrogram of key poses can therefore be constructed from bottom to top. Figure 2 shows the dendrogram of the left hand, where L = left, R = right, Sh = shoulder, El = elbow, Wr = wrist, Th = thumb, Fir = first finger, Mi = middle finger, Rin = ring finger and Lit = little finger.
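As an illustrative sketch only (the detection threshold, joint grouping and toy data are assumptions, not values from the paper), a Level One key pose can be located as a sample at which every joint's angular velocity vanishes, and the priority rules above can be stored as ordered chains for assigning the lower levels:

import numpy as np

# Rules 1-3 as ordered chains, highest priority (deepest dendrogram level) first
PRIORITY = {
    "arm":   ["finger", "wrist", "elbow", "shoulder"],
    "leg":   ["toe", "ankle", "knee", "hip"],
    "trunk": ["neck", "waist"],
}

def level_one_key_poses(tracks, dt, eps=1e-6):
    """tracks: (T, N) joint-angle samples; returns indices where all |velocities| < eps."""
    vel = np.gradient(tracks, dt, axis=0)
    return np.where(np.all(np.abs(vel) < eps, axis=1))[0]

# toy tracks for two joints that pause together at the start and at the end
hold_a = np.zeros(5)
move = np.linspace(0.0, 1.0, 11)[1:-1]
hold_b = np.ones(5)
q = np.concatenate([hold_a, move, hold_b])
tracks = np.stack([q, 0.5 * q], axis=1)
print(level_one_key_poses(tracks, dt=0.1))   # indices inside the initial and final pauses
for chain, order in PRIORITY.items():
    print(chain, "-> levels from highest to lowest priority:", order)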

Thus the period of time for the Level One key pose is T_1 = t_{1_2} - t_{1_1} and:

$$t_{\gamma\_1} = \min_{1 \le \eta \le M_\gamma} t_{\gamma\_\eta\_1}, \qquad t_{\gamma\_2} = \max_{1 \le \eta \le M_\gamma} t_{\gamma\_\eta\_2}, \qquad 1 \le \gamma \le L$$
(4)

where L represents the maximum level, M_γ represents the number of nodes in the γth level, t_{γ_1} and t_{γ_2} represent the earliest and the latest times of all key poses in the γth level and t_{γ_η_1} and t_{γ_η_2} represent the earliest and the latest times of the ηth key pose in the γth level, respectively. Thus, T_γ = t_{γ_2} - t_{γ_1} is the period of time for the γth level and:

[Eq. 5: equation image not reproduced in this version]
(5)
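Under the reading of Eq. 4 given above, which is itself an assumption since the equation images are not reproduced, the period of one level can be computed from its nodes' key-pose windows as in this small sketch; the window values are hypothetical.

def level_period(node_times):
    """node_times: one (t_start, t_end) pair per node eta of level gamma."""
    t_gamma_1 = min(start for start, _ in node_times)   # earliest key-pose time in the level
    t_gamma_2 = max(end for _, end in node_times)       # latest key-pose time in the level
    return t_gamma_2 - t_gamma_1                         # T_gamma

# hypothetical key-pose windows (sec) for the four nodes of one level
nodes = [(0.0, 0.4), (0.1, 0.5), (0.2, 0.6), (0.0, 0.3)]
print(level_period(nodes))   # 0.6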

EXPERIMENTS

The synchronous movement of key poses for the humanoid robot was realized through dance actions. The experiments were carried out on the humanoid robot Aldebaran Nao. For the upper limbs, the key poses were {L_Finger, L_Wrist, L_Elbow, L_Shoulder} on the left side and {R_Finger, R_Wrist, R_Elbow, R_Shoulder} on the right side. For the lower limbs, the key poses were {L_Toe, L_Ankle, L_Knee, L_Hip} on the left side and {R_Toe, R_Ankle, R_Knee, R_Hip} on the right side.

Figure 3 showed the six Level One key poses of the dance, in which each limb of the human body converted from the 1st pose to the 6th one within a period of 13 sec.

Every Level One pose could be divided into four levels of key poses, in which the finger level corresponded to the toe level, the wrist to the ankle, the elbow to the knee and the shoulder to the hip. The sampling period was 100 msec. All levels acted simultaneously. The comparisons of the dance angle tracks between the right side of the human being and that of the humanoid robot are shown in Fig. 4; the comparisons for the left side were similar. The tracks of the humanoid robot matched those of the human body to a large extent, even though the electronic and mechanical characteristics of the humanoid robot differ from those of a human. The simulated and practical effects are shown in Fig. 5 and 6. The experiment showed that when the pose was divided down to the smallest level, the synchronization of the similarity movement could be realized through the synchronous control of the joint angles. We had previously realized the falling forward of the similarity movement for the humanoid robot (Wen-De et al., 2006, 2010), in which there were three Level One key poses: standing still, falling forward and touching the ground. Compared with the experimental effect in Wen-De et al. (2010), the multi-level key poses in this study realize the synchronization of all links of the humanoid robot.
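For reference, a replay of such synchronized key poses on Nao could look like the following hypothetical sketch using the NAOqi Python SDK. The IP address, joint selection, angles and timestamps are illustrative placeholders, this is not the authors' experimental code, and on a real robot the leg targets would additionally require whole-body balance control; the key point is that giving every joint the same timestamp list makes all levels reach each key pose at the same instants.

from naoqi import ALProxy

ROBOT_IP, ROBOT_PORT = "192.168.0.10", 9559             # placeholder network address

names = ["RShoulderPitch", "RShoulderRoll", "RElbowRoll",
         "RHipPitch", "RKneePitch", "RAnklePitch"]
# one angle list (rad) per joint: three key poses per joint in this toy example
angles = [[0.2, 0.8, 0.2], [-0.1, -0.4, -0.1], [0.3, 1.0, 0.3],
          [0.1, -0.3, 0.1], [0.4, 0.9, 0.4], [-0.2, -0.5, -0.2]]
# identical time stamps for every joint, so all joints reach each key pose together
times = [[1.0, 2.0, 3.0]] * len(names)

motion = ALProxy("ALMotion", ROBOT_IP, ROBOT_PORT)
motion.setStiffnesses("Body", 1.0)                      # enable the joint motors
motion.angleInterpolation(names, angles, times, True)   # True: absolute joint angles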

Image for - Study of Key Pose of Movement Similarity on Humanoid Robot
Fig. 3(a-f): Dance effects of key poses of, (a) 1st, (b) 2nd, (c) 3rd, (d) 4th, (e) 5th and (f) 6th pose

Image for - Study of Key Pose of Movement Similarity on Humanoid Robot
Fig. 4(a-i): Comparison effects of corresponding angles of, (a) RshoulderRoll, (b) RshoulderPitch, (c) RkneePitch, (d) RhipRoll, (e) RhipPitch, (f) RelbowRoll, (g) RelbowRoll, (h) RankleRoll and (i) RanklePitch angle

Image for - Study of Key Pose of Movement Similarity on Humanoid Robot
Fig. 5(a-f): Simulating effects of key poses of, (a) 1st, (b) 2nd, (c) 3rd, (d) 4th, (e) 5th and (f) 6th pose

Image for - Study of Key Pose of Movement Similarity on Humanoid Robot
Fig. 6(a-f): Practical effects of key poses of, (a)1st, (b) 2nd, (c) 3rd, (d) 4th, (e) 5th and (f) 6th pose

CONCLUSION

In this study, the extended architecture for obtaining and applying similar tracks is described, the similarity degree defined through the differences in angles, angular velocities and angular accelerations between the human being and the humanoid robot is proposed and the synchronous control of multi-level key poses for the humanoid robot is discussed. In the experiment, the humanoid robot Aldebaran Nao is used to realize the multi-level dance poses of the similarity movement acquired from a human being.

REFERENCES

1:  Chen, P.C., 2009. Using immune network in nonlinear system identification for a 3D parallel robot. Inform. Technol. J., 8: 895-902.

2:  Bo, Z.Q., H.B. Rong, P.S. Hao and P.Q. Shu, 2010. Complex motion planning for humanoid robot: A review. Inform. Technol. J., 9: 1270-1277.

3:  Soltanpour, M.R., M.M. Fateh and A.R. Ahmadi Fard, 2008. Nonlinear tracking control on a robot manipulator in the task space with uncertain dynamics. J. Applied Sci., 8: 4397-4403.

4:  Naeimi, M., M. Teshnehlab, M. Aliyari Sh and M. Aliasghary, 2009. Stable direct adaptive control as nonlinear hybrid controller for flexible manipulator. J. Applied Sci., 9: 1258-1266.

5:  Arbaoui, F., M.L. Saidi, S. Kermiche and H.A. Abbassi, 2006. Identification and trajectory control of a manipulator arm using a neuro-fuzzy technique. J. Applied Sci., 6: 2275-2280.

6:  Motlagh, O., 2011. An FCM-based design for balancing of legged robots. J. Artif. Intellig., 4: 295-299.

7:  Moldenhauer, J., I. Boesnach and T. Beth, 2005. Analysis of human motion for humanoid robots. Proceedings of the International Conference on Robotics and Automation, April 18-22, 2005, Barcelona, Spain, pp: 312-317

8:  Huang, Q., Z. Yu, W. Zhang, W. Xu and X. Chen, 2010. Design and similarity evaluation on humanoid motion based on human motion capture. Robotica, 28: 737-745.

9:  Ruchanurucks, M., S. Nakaoka, S. Kudoh and K. Ikeuchi, 2011. Generation of humanoid robot motions with physical constraints using hierarchical B-spline. Proceedings of the International Conference on Intelligent Robots and Systems, January 17, 2011, San Francisco, USA., pp: 674-679

10:  Xiao-Jun, Z., H. Qiang, P. Zhao-Qin, Z. Li-Ge and L. Ke-Jie, 2005. Kinematics mapping of humanoid motion based on human motion. Robot, 27: 358-361.

11:  Takano, W., K. Yamane and Y. Nakamura, 2007. Capture database through symbolization, recognition and generation of motion patterns. Proceedings of the International Conference on Robotics and Automation, April 10-14, 2007, Roma, Italy, pp: 3092-3097

12:  Kim, S., C.H. Kim and J.H. Park, 2006. Human-like arm motion generation for humanoid robots using motion capture database. Proceedings of the International Conference on Intelligent Robots and Systems, May 2006, Beijing, China, pp: 3486-3491

13:  Wen-De, K., C. Gang, H. Bing-Rong, C. Ze-Su and Y. Quan-De, 2010. On biped walking of humanoid robot based on movement similarity. Robot, 32: 766-772.

14:  Wen-De, K., C. Gang, H. Bing-Rong, C. Ze-Su, P. Song-Hao and Z. Qiu-Bo, 2006. Falling forward of humanoid robot based on similarity with parametric optimum. Acta Automatica Sinica, 37: 1006-1013.

15:  Liu, H.Y., W.J. Wang, R.J. Wang, C.W. Tung, P.J. Wang and I.P. Chang, 2011. Image recognition and force measurement application in the humanoid robot imitation. IEEE Trans. Instrum. Meas., 61: 149-161.

16:  Zhixiang, T., W. Hongtao and F. Chun, 2011. Target capture for free-floating space robot based on binocular stereo vision. Inform. Technol. J., 10: 1222-1227.

17:  Hapsari, G.C. and A.S. Prabuwono, 2010. Human motion recognition in real-time surveillance system: A review. J. Applied Sci., 10: 2793-2798.

18:  Liu, H.Y., W.J. Wang and R.J. Wang, 2011. A course in simulation and demonstration of humanoid robot motion. IEEE Trans. Educ., 54: 255-262.

19:  Wang, R.J., J.W. Zhang, J.M. Xu and H.Y. Liu, 2009. The multiple-function intelligent robotic arms. Proceedings of the IEEE International Conference on Fuzzy Systems, August 20-24, 2009, Jeju Island, South Korea, pp: 1995-2000

20:  Ramos, O.E., L. Saab, S. Hak and N. Mansard, 2011. Dynamic motion capture and edition using a stack of tasks. Proceedings of the 11th IEEE-RAS International Conference on Humanoid Robots (Humanoids), October 26-28, 2011, Bled, Slovenia, pp: 224-230

21:  Albrecht, S., K. Ramirez-Amaro, F. Ruiz-Ugalde, D. Weikersdorfer, M. Leibold, M. Ulbrich and M. Beetz, 2011. Imitating human reaching motions using physically inspired optimization principles. Proceedings of the 11th IEEE-RAS International Conference on Humanoid Robots (Humanoids), October 26-28, 2011, Bled, Slovenia, pp: 602-607

22:  Boutin, L., A. Eon, S. Zeghloul and P. Lacouture, 2010. An auto-adaptable algorithm to generate human-like locomotion for different humanoid robots based on motion capture data. Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, October 18-22, 2010, Taipei, Taiwan, pp: 1256-1261

23:  Do, M., J. Romero, H. Kjellstrom, P. Azad, T. Asfour, D. Kragic and R. Dillmann, 2009. Grasp recognition and mapping on humanoid robots. Proceedings of the 9th IEEE-RAS International Conference on Humanoid Robots, December 7-10, 2009, Paris, France, pp: 465-471
