
Journal of Applied Sciences

Year: 2011 | Volume: 11 | Issue: 12 | Page No.: 2145-2153
DOI: 10.3923/jas.2011.2145.2153
Application of a Virtual Reality Entertainment System with Human-Machine Sensor Device
Kuei Shu Hsu

Abstract: In this study, an interactive virtual reality motion simulator is designed and analyzed. The main components of the system include a bilateral control interface, networking, a virtual environment and a motion simulator. The virtual reality entertainment system uses a virtual environment which, through a human motion-sensing device, lets the operator feel feedback from the virtual environment as if he/she were in the real environment. The control scheme for the simulator uses the changes in velocity and acceleration that the operator imposes on the joystick, the environmental changes imposed on the motion simulator and the sensor-device feedback to the operator to maneuver the simulator in the real environment. In addition, we develop a calculation method to evaluate the VR ratio of the simulation results. It is shown that the proposed control scheme can improve the performance of the visual entertainment system. Experiments are conducted on the virtual reality entertainment system to validate the theoretical developments.


How to cite this article
Kuei Shu Hsu, 2011. Application of a Virtual Reality Entertainment System with Human-Machine Sensor Device. Journal of Applied Sciences, 11: 2145-2153.

Keywords: Active platform, space entertainment system, motion simulator system, virtual reality and sensor device

INTRODUCTION

Virtual Reality (VR) systems have become an integral part of many industrial sectors such as aviation, medicine and manufacturing (Basdogan et al., 2007; Hsu et al., 2008; Li and Wang, 1998). VR systems have been used to train and educate people in the field for economic or other justifiable reasons, as in the case of hazardous sites. For example, in-flight simulators have been used extensively to train pilots and would-be pilots because human errors could be catastrophic in real situations. In the medical field, VR systems have been used to train doctors to diagnose diseases without intrusive diagnostic procedures (Haque and Srinivasan, 2006). Most VR systems make use of haptic interfaces, which allow the operator to feel as if s/he is dealing with a physical environment (Her et al., 2002). The interaction between a human operator and machines is achieved through an intermediary device known as a “sensor device” (Basdogan et al., 2004). This device can be programmed to impose arbitrary trajectory-dependent forces on the operator’s arm. The haptic device has a force-feedback characteristic that allows operators to feel contact on the palm of the hand by grasping a handle or grip. The use and utility of such devices have been on the rise due to advancements in various technological sectors. The elements of the force-feedback mechanism, such as connecting rods, linkages and encoders, are mainly passive, since neither the controlled object nor the virtual environment provides any force feedback to the operator. The haptic device itself, however, is power actuated and provides force feedback to the operator. Kwon et al. (1999) used a haptic interface device for a manpower amplifier. Bielser et al. (2004) realized tele-operation for a VR system with haptic characteristics that allows an operator to probe and feel a remote virtual environment.

Haptic devices come in different sizes and shapes; a small pen-based interface arrangement has been used to create freeform surfaces or simulate surgical tools. Robot manipulators have been used as the force interface (Magnenat-Thalmann and Bonanni, 2006). Sierra et al. (2006) designed a haptic device that uses a specialized robot manipulator to apply force feedback between the user and the VR environment. Chen et al. (2009a) developed a model for a haptic interface device which can give the operator the feeling that s/he is maneuvering a mass, or pushing on a spring or a damper. Li and Wang (1998) modeled force/torque sensing for a working environment using physics-based components (e.g., mass and spring/damper) in the VR simulation system. The haptic interface used in the experiment helped the operator feel the reaction forces from the slave environment. Building on this past experience, this study uses a submarine simulation design system to achieve an interactive virtual reality environment.

The slave robot is replaced with a 2-D motion simulator system and the x-y table is replaced with a joystick. The system requires controllers to drive the seat of the simulator as well as the haptic interface. The advantage of the control scheme presented in this study is that the dynamics of the human arm, the actuators and the environment are included in the closed-loop control model for stability analysis (Her et al., 2002).

The aim of this study is to propose a new method for determining the VR ratio between the real haptic motion simulator and the virtual one (the transfer function between the real and virtual motion simulators), as shown in Fig. 1 and 2. It is therefore no longer necessary to spend a great deal of time adjusting the motion simulator to the virtual environment. In this study we show how to use a motion simulator with limited space to simulate an unlimited virtual environment and discuss the relationships among acceleration, velocity and displacement in the virtual environment and the motion simulator.

The whole system combines the remote-control robotic system, the virtual reality system and the VR ratio of the real motion simulator; these dynamic subsystems form a complex multiple bilateral control problem.

Virtual reality combines human perception with machines through what is called a “Man-Machine Interface”. Human operators can interact with virtual reality systems through the senses of sight, hearing and touch and through action. Because of the fast development of computer technology and man-machine interface devices, virtual reality has been used in various applications, e.g., entertainment, education, medical science, the military, flight simulation, remote mission manipulation and many kinds of training.

Fig. 1: One-axis motion simulator for space entertainment system

Fig. 2: The software simulation display of virtual environment system

These applications make virtual reality an interdisciplinary research topic. There is no denying that the applications of virtual reality are unlimited and that the designer’s imagination is one of the most important elements in making a creation come true (Chen et al., 2009b).

Human operators immerse themselves in the virtual environment through the Man-Machine Interface. This is what distinguishes virtual reality from general multimedia: the operator does not merely watch but perceives everything as real.

SYSTEM DESCRIPTION

In order to simulate an unlimited space in the virtual environment, one has to create a new way to make it possible. Although the motion simulator has a limited workspace, one can define a filter that determines the maximum movement of the motion simulator and thereby limits its moving range. All these conditions are based on the requirement that if acceleration occurs in the virtual environment, the human on the motion simulator should feel acceleration as well. Owing to the limited space of the motion simulator, when the virtual environment undergoes a uniform velocity, the chair on the motion simulator returns to its original position. Because of the different spaces of the motion simulator and the virtual environment, one can divide the problem into three cases:

Assume that the ratio of the virtual space to the real space is K; the virtual space is unlimited while the real space is limited. One can therefore obtain the admissible range of K and search for the value that gives the operator the best feeling.

The feeling involves several senses. The sense of sight comes from the variation of the virtual environment and the sense of motion comes from the motion simulator, as shown in Fig. 3. The human operator feels a 1:1 relation of acceleration and displacement between the sense of sight and the virtual environment, but the sense of motion has a K-fold relation with the motion simulator and must be in the same phase.

The diagram of the multiple bilateral control system is shown in Fig. 4. The operator feels force feedback Thj from the joystick and Ths from the motion simulator. Gs and Gj are the closed-loop controllers. Tej and Tes are the transfer functions of the virtual environment dynamics; Kej and Kes are the transfer functions for Fj and Fs. Tej, Tes, Kej and Kes provide the visual feedback to the operator.

Fig. 3: Vision and moving feeling

Fig. 4: Diagram of the VR entertainment system

Obviously, there are two interacting subsystems in the diagram, Yj (for the joystick) and Ys (for the simulator). This system model includes a bilateral control interface between two haptic devices and a virtual environment.

In Fig. 5, when the operator manipulates the joystick, he receives force feedback from the joystick, a moving feeling from the motion simulator and vision from the virtual environment. In the meantime, the velocity data of the joystick are sent to the virtual environment. The motion simulator then moves according to the data calculated through Ks/(s+a), which is the transfer function Khs in Fig. 4. However, if the motion simulator moves abruptly, it will make the joystick sway. In this case, it is assumed that the operator is on the motion simulator and holds the joystick, so the operator’s body acts like a shock absorber that keeps the joystick from shaking continuously. Hence, the gains K and a need to be adjusted to remove this noise; otherwise the reciprocal effects of the shaking joystick and the moving simulator will make the whole system unstable.
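As a sketch of how the velocity command might be passed through Ks/(s+a) in software, the filter can be discretized with a backward-Euler step. This is a hypothetical illustration, not the paper's implementation; the values K = 2, a = 4 and the sample time are assumptions:

```python
def washout_filter(u, K=2.0, a=4.0, dt=0.01):
    """Backward-Euler discretization of H(s) = K*s/(s + a).

    u: sequence of input samples (e.g., the joystick velocity command).
    Returns the filtered samples that would drive the motion simulator.
    """
    out = []
    y_prev = u_prev = 0.0
    for u_n in u:
        # Discretized ODE: (y_n - y_prev)/dt + a*y_n = K*(u_n - u_prev)/dt
        y_n = (y_prev + K * (u_n - u_prev)) / (1.0 + a * dt)
        out.append(y_n)
        y_prev, u_prev = y_n, u_n
    return out
```

A sudden change in the input produces a nearly K-fold kick, while a sustained constant input washes out to zero, which matches the behavior described above: the chair responds to abrupt acceleration but drifts back during uniform motion.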

CONTROL ANALYSIS

Acceleration of virtual environment and the motion simulator: Based on this conception, this study assumes that the joystick’s variation of displacement represents the velocity of the virtual environment. The transfer function between the motion simulator and the virtual environment is assumed to be of the form Ks/(s+a), acting like a filter, as shown in Fig. 6. If one accelerates gradually, the joystick moves slowly; the low frequency makes s very small, so s/(s+a) approaches 0. Under this condition the chair on the motion simulator does not change its displacement and the human operator perceives the speed only through the sense of sight from the virtual environment. In contrast, if one accelerates instantly, the joystick moves very fast; the high frequency makes s very large, so s/(s+a) approaches 1. That is, there is a K-fold relation between the motion simulator and the virtual environment.

Acceleration of the virtual environment (AV):

AV(s) = s·V(s)

Acceleration of the motion simulator (AR):

AR(s) = (Ks²/(s+a))·V(s)

The ratio of the motion simulator and the virtual environment is described by:

AR(s)/AV(s) = Ks/(s+a)

From Fig. 7 and 8, in the case of high frequency, s approaches infinity, so s/(s+a) approaches 1 and AR/AV → K (the joystick moves instantly). In the case of low frequency, s is very small, so s/(s+a) approaches 0 and AR/AV → 0 (the human operator feels the acceleration through the sense of sight). Therefore, at high frequency the ratio AR/AV approaches K as the joystick moves, whereas at low frequency AR/AV approaches zero and the operator senses the acceleration through vision. From Fig. 9, one can see that the ratio AR/AV exhibits a proportional distribution.

In Fig. 10, the Bode diagram of s/(s+a) shows that the magnitude approaches 1 at high frequency and that, no matter how much the operator accelerates or decelerates, the motion simulator remains in the same phase.
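The low- and high-frequency behavior read off the Bode diagram can be verified numerically. A minimal sketch, assuming a = 4 (an illustrative value):

```python
import cmath

def freq_response(omega, a=4.0):
    """Magnitude and phase of H(jw) = jw/(jw + a) at frequency omega."""
    h = complex(0.0, omega) / complex(a, omega)
    return abs(h), cmath.phase(h)

# Low frequency: magnitude near 0 (operator senses acceleration visually).
# High frequency: magnitude near 1 and phase near 0 (simulator moves
# K-fold and in phase with the virtual environment).
```

Evaluating at a very low and a very high frequency confirms the two limiting cases described in the text.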

Fig. 5: System frame

Fig. 6: Simulation of the acceleration of the motion simulator

Fig. 7: Acceleration of virtual environment

Namely, when the operator accelerates forward with a joystick, the motion simulator won’t go back to the opposite direction.

Fig. 8: Acceleration of the motion simulator

Fig. 9: AR/AV exhibits a proportional distribution

Fig. 10: Bode diagram of s/s+a

Displacement of virtual environment and the motion simulator: The joystick’s variation of displacement is set to the velocity of the virtual environment, V(s). Passing this velocity through an integrator gives the displacement of the virtual environment; passing it through the integrator and the filter s/(s+a) and multiplying by K gives the displacement of the motion simulator as follows:

XR(s) = K·(s/(s+a))·(1/s)·V(s)

where,

XV(s) = (1/s)·V(s)

thus,

XR(s) = (K/(s+a))·V(s)

Given that the maximum displacement of the motion simulator is XRmax, K can be computed. The time response of the VR environment as well as the motion simulator due to a joystick command is shown in Fig. 11.
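Since a sustained velocity v_ss driven through the cascade above settles the simulator at a displacement of K·v_ss/a, the K that just fits a given travel can be backed out directly. A sketch; the numeric values in the usage note are assumptions, not values from the paper:

```python
def scaling_for_travel(x_r_max, v_ss, a=4.0):
    """Largest K that keeps the simulator within its travel x_r_max
    when the virtual environment holds a sustained velocity v_ss.

    The steady state of K/(s+a) driven by a velocity step v_ss is
    K*v_ss/a, so the limiting value is K = a * x_r_max / v_ss.
    """
    return a * x_r_max / v_ss
```

For example, with 0.5 m of usable travel, a = 4 and a 2 m/s sustained virtual velocity, the admissible scaling is K = 1.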

3D aircraft case: In this section, one can control the airplane by keyboard or joystick. When the airplane collides with something in the game, the joystick vibrates. The flight data, velocity and acceleration, are indicated by the program interface. If the resistance of water is considered, a submarine system can be simulated, as shown in Fig. 12 and described by:

mẍ = F − bẋ

If the throttle is set to accelerate, where

F = κδ

then we can obtain:

mẍ + bẋ = κδ

Therefore, we can derive two system equations that relate the displacement and velocity of the submarine to δ(s):

X(s)/δ(s) = κ/(s(ms+b)) and V(s)/δ(s) = κ/(ms+b)

or, expressed in the time domain for a constant throttle δ(t) = Δ:

v(t) = (κΔ/b)(1 − e^(−bt/m))

If t → ∞, then:

v(∞) = κΔ/b

where κΔ/b is the terminal velocity.
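The approach to the terminal velocity κΔ/b can be checked by integrating m·dv/dt = F − b·v numerically. A sketch with assumed parameters; m, b and F below are illustrative values, not the paper's:

```python
def submarine_velocity(F, m=100.0, b=20.0, dt=0.01, steps=10000):
    """Forward-Euler integration of m*dv/dt = F - b*v from rest.

    The velocity approaches the terminal value F/b as t grows.
    """
    v = 0.0
    for _ in range(steps):
        v += dt * (F - b * v) / m
    return v
```

With F = κΔ = 40 and b = 20, the velocity settles near the terminal value 40/20 = 2, reaching it within a few time constants m/b.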

Besides, because of:

XR(s) = K·(s/(s+a))·XV(s) = (K/(s+a))·V(s)

where XV(s) = (1/s)·V(s) is the displacement of the virtual environment and K is a scaling factor.

Fig. 11: Displacement of the virtual environment and the motion simulator


Thus, at the terminal velocity v(∞) = κΔ/b, the displacement of the motion simulator settles to:

xR(∞) = K·v(∞)/a = KκΔ/(ab)

and the maximum of xR can be written as:

xRmax = KκΔ/(ab)

Because of the length limit of the motion simulator (Fig. 13), xRmax must be less than xlimit. Since xRmax = KκΔ/(ab) at the terminal velocity, thus:

KκΔ/(ab) < xlimit

then:

K < ab·xlimit/(κΔ)
So when K = 1, the human operator will feel a 1:1 reality between the real displacement and the visual displacement.
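The travel-limit constraint can also be checked end-to-end by simulating the whole chain, throttle → submarine velocity → filtered simulator displacement. A hypothetical sketch; every numeric value here is an assumption for illustration:

```python
def max_simulator_travel(K, kappa, delta, m=100.0, b=20.0, a=4.0,
                         dt=0.01, steps=20000):
    """Peak simulator displacement for a constant throttle input.

    Vehicle:    m*dv/dt = kappa*delta - b*v
    Simulator:  dx/dt   = K*v - a*x   (the K/(s+a) stage on velocity)
    """
    v = x = x_max = 0.0
    for _ in range(steps):
        v += dt * (kappa * delta - b * v) / m
        x += dt * (K * v - a * x)
        x_max = max(x_max, abs(x))
    return x_max
```

The peak settles near K·κΔ/(a·b); comparing it against the physical travel limit of the chair confirms whether a chosen K is admissible.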

EXPERIMENT

The research project was conducted under the supervision of the Department of Applied Geoinformatics, Chia Nan University of Pharmacy and Science, Taiwan 717, Republic of China. It was sponsored by the National Science Council, Taiwan, R.O.C. under grant NSC 98-2221-E-041-009 and was conducted from June 15, 2009 to August 31, 2010.

Fig. 12: Aircraft case

Fig. 13: The real displacement of the chair on the motion simulator

Fig. 14: Hardware configuration of the experimental system

Fig. 15: The software of the virtual environment system for the one-axis motion simulator

The hardware configuration of the experimental system is illustrated in Fig. 14, where it mainly consists of the force feedback joystick, the motion simulator and the eye-trek. This multiple bilateral control system integrates servo control, mechanical design, dynamic analysis, graphic interface and virtual reality system. Several experiments are conducted to validate the theoretical developments.


There are two major parts in the system. The first considers the properties of the transfer function Ks/(s+a) without the auto-back function; experiments with different values of K and a, different frequencies and different operators were conducted to obtain the best VR ratio between the real haptic motion simulator and the virtual one. The second considers the influence of the auto-back function on the system.

In Fig. 15a, the system is operated at low frequency without the auto-back function; one can observe the variation of the virtual simulator in the virtual environment, with a uniform velocity in the intermediate zone. Two settings were used, K = 0.8 with a = 1 and K = 2 with a = 4, because these two settings gave better experimental results for the feeling of the operator with less unstable noise. It is observed that if the operator accelerates slowly, the motion simulator also accelerates slowly, but with a small slope. When the virtual simulator keeps a uniform velocity, the motion simulator decelerates until its velocity equals zero. When the operator then decelerates slowly, the motion simulator moves backward even though the virtual simulator still goes forward. The transfer function Ks/(s+a) simulates the accelerated feeling (direction of force) of a human; suppose the operator is in a real car. As the car accelerates forward, the operator feels a force backward, so the motion simulator can accelerate forward. If the car decelerates, the operator feels a force forward, so the motion simulator decelerates backward instead. The key point is how to create the best feeling by adjusting the values of K and a so that the operator does not notice the changes in the moving direction of the motion simulator.

The displacement of the motion simulator, obtained by integrating its velocity, is shown in Fig. 15b. Although the virtual car still goes forward in the virtual environment, the real displacement of the motion simulator returns to its original place.

CONCLUSION

Through Ks/(s+a), it is no longer necessary to spend time adjusting the motion simulator to the virtual environment and the haptic feedback for the accelerated feeling of the operator is simulated successfully. The performance and stability of the whole system can be regulated by varying the gains of the transfer function and the effect of the simulation can be obtained and evaluated swiftly by calculating the VR ratio. A haptic motion simulator is designed for virtual reality applications. The first system is designed assuming linear motion in the x-direction only. A dynamic model for the one-dimensional simulator system with haptic behavior, which includes the dynamics of the human arm and the environment, is presented in this study. It is shown that the stability and the desired performance of the closed-loop control system are achievable through a simple filter transfer function. The theoretical and experimental findings were extended to a two-degree-of-freedom simulator assuming the degrees of freedom are independent. The motion simulator system was used with a virtual environment and the results were very satisfactory.

REFERENCES

  • Basdogan, C., M. Sedef, M. Harders and S. Wesarg, 2007. VR-based simulators for training in minimally invasive surgery. IEEE Comput. Graphics Appl., 27: 54-66.
    CrossRef    Direct Link    


  • Basdogan, C., S. De, J. Kim, M. Manivannan, H. Kim and M.A. Srinivasan, 2004. Haptics in minimally invasive surgical simulation and training. IEEE Comput. Graphics Appl., 24: 56-64.
    CrossRef    


  • Bielser, D., P. Glardon and M. Teschner, 2004. A state machine for real-time cutting of tetrahedral meshes. Graphical Models, 66: 398-417.
    CrossRef    


  • Chen, K.C., K.S. Hsu and M.G. Her, 2009. A real-time vision tracking system using human emulation device. J. Applied Sci., 9: 2861-2876.
    CrossRef    


  • Chen, K.C., K.S. Hsu, M.G. Her and M.C. Chiu, 2009. Application of the two-axle robot tracing object system with multithread control technology. Inform. Technol. J., 8: 39-48.
    CrossRef    Direct Link    


  • Haque, S. and S. Srinivasan, 2006. A meta-analysis of the training effectiveness of virtual reality surgical simulators. IEEE Trans. Inform. Technol. Biomed., 10: 51-58.
    CrossRef    


  • Her, M.G., K.S. Hsu and W.S. Yu, 2002. Analysis and design of a haptic control system: Virtual reality approach. Int. J. Adv. Manuf. Technol., 19: 743-751.
    CrossRef    Direct Link    


  • Hsu, K.S., C. Ko-Chun, L. Tsung-Han and C. Min- Chie, 2008. Development and application of the single-camera vision measuring system. J. Applied Sci., 8: 2357-2368.
    CrossRef    Direct Link    


  • Kwon, D.S., K.Y. Woo and H.S. Cho, 1999. Haptic control of the master hand controller for a microsurgical telerobot system. Proc. IEEE Int. Conf. Robotics Automat., 3: 1722-1727.
    CrossRef    


  • Li, Y.F. and J.G. Wang, 1998. Incorporating dynamic sensing in virtual environment for robotic tasks. Proc. IEEE Instrumentation Meas. Technol. Conf., 1: 123-127.
    CrossRef    


  • Magnenat-Thalmann, N. and U. Bonanni, 2006. Haptics in virtual reality and multimedia. IEEE Multimedia, 13: 6-11.
    CrossRef    


  • Sierra, R., G. Zsemlye, G. Szekely and M. Bajka, 2006. Generation of variable anatomical models for surgical training simulators. Med. Image Anal., 10: 275-285.
    CrossRef    
