
1 Introduction

Virtual reality technology aims to deliver content scenarios to users as natural experiences. To achieve this, we study interface technologies that realize virtual-space simulation and multisensory feedback at a high level of realism. Humans acquire their sensory feedback processes naturally as they grow. However, the sensory quality of the HMD, currently the representative VR interface technology, does not fully reproduce the feedback generated by the human visual system. The primary consequence is reduced visual satisfaction and immersion in the content. A secondary consequence is the induction of cognition that differs from the actual situation, which distorts and degrades task results and causes negative after-effects during use (e.g. dizziness) or after use (e.g. headache).

The goal of this study is to solve the real–virtual space discrepancy problem that occurs when an HMD is used. Specifically, when the user directly manipulates an object in near-body space, our method should support accurate interaction while providing the same distance perception as in real space. To achieve this goal, we quantitatively measure the behavioral characteristics of various user groups under different VR system conditions while reproducing, in virtual space, common interactions that can be experienced in real space. We then develop an algorithm that corrects the visualization-related parameters of the virtual reality content so as to reduce the measured error, allowing the given task to be performed successfully in the experiment.

Generally, when working with objects located more than about 1–2 m from the user, the user's actions cause no direct physical interaction (e.g. contact or collision). In that case, it is more important to express the relative interaction between virtual objects in the environment outside the user than to consider user-centered sensory cues such as proprioception from body motion. Since our research targets direct manipulation in near-body space, however, we developed a technology that controls the virtual reality content so that the user's sense of action and the sensory stimuli formed in real space are reproduced identically in virtual space.

2 Related Works

Human depth perception is determined by the interaction of multiple factors. Binocular disparity and convergence arise from the physiological structure of the visual organs. In addition, monocular cues learned from experience include focus (accommodation), perspective, occlusion, lighting and shading, color intensity and contrast, and relative motion. Since these factors act in combination in the HMD viewing environment, interaction in virtual space can be performed as in real space only if the user perceives depth and space accurately.

Past research [1,2,3,4,5,6] and the many HMD-based virtual reality applications on the market show that users experience distance in virtual space differently than in real space. Renner et al. [7] addressed the problem that subjectively perceived distance in a virtual environment is shorter than in the real environment; they reported that a rich virtual-environment representation, including disparity that is as accurate as possible, high-quality graphics, careful virtual camera setup, and floor textures that support distance estimation, helps improve the user's sense of distance. Siegel [8] pointed out that the validity of training or research performed in virtual space can be questioned because distance perception there tends to be underestimated, and proposed correcting the underestimated distance through an interaction-task experiment. Altenhoff [9] likewise noted that the closer the target, the greater the underestimation of distance, and reported that it can be overcome by calibration. Ziemer et al. [10] studied how the act of measuring distance perception in one environment, real or virtual, affects distance perception measured in the other. Yang et al. [11] proposed a technology that naturally connects stereoscopic 3D distance perception between real and virtual space by controlling hardware design parameters, and proposed a new EGD structure for direct interaction in near-body space.

3 Egocentric Perception and Interaction in Near-Body Space

In this study, as a representative example of a user interacting directly in free space, we selected the situation in which a key is inserted into a hole, as shown in Fig. 1, from among the service scenarios of HMD-based indoor experience-type first-person shooting (FPS) content.

Fig. 1. Experiment design: generalizing near-body space interaction in a real environment to a peg-in-hole (target matching) task in a virtual environment with haptic feedback.

3.1 Experiment Design

To provide the same spatial impression as real space, the IPD (inter-pupillary distance) adjustment of the HMD hardware is set to each subject's measured IPD. The virtual camera of the real-time 3D rendering engine takes this hardware value as its default and generates the virtual-space image reflecting the subject's IPD and viewpoint. We also investigated the relationship between the GFOV (geometric field of view), a software-side control element of HMD rendering, and distance perception in near-body space interaction.
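As background for the GFOV manipulation: the geometric field of view follows the standard pinhole relation GFOV = 2·arctan(w / 2d) for an image plane of width w viewed at distance d. The following minimal C++ sketch computes this relation; the plane width and eye distance are illustrative assumptions, not measurements from this study.

```cpp
#include <cstdio>
#include <cmath>

// Geometric field of view (degrees) of an image plane of width w
// viewed from distance d; w and d must share the same length unit.
static double gfovDeg(double w, double d) {
    const double kPi = 3.14159265358979;
    return 2.0 * std::atan(w / (2.0 * d)) * 180.0 / kPi;
}

int main() {
    // Hypothetical optics: a 9 cm virtual image plane 5 cm from the eye.
    std::printf("GFOV = %.1f deg\n", gfovDeg(0.09, 0.05));
    return 0;
}
```

When the GFOV rendered by the engine differs from the physical FOV of the display, the image is effectively magnified or minified, which is one widely reported route by which such software parameters shift perceived distance.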

To compensate for the positional discrepancy between the actual three-dimensional space and the virtual space constructed by the tracking system, tracking sensors are attached to all manipulated objects so that they share a common tracking space. As shown in the center of Fig. 1, a number of keyholes, the target points, are placed within a field-of-view range of about 110° at a distance of 120 cm. The height of the keyholes was set to 116.4 cm, the height between the elbow and the armpit of an adult in the early twenties, following the national physical size database (Size Korea; sizekorea.kr) of the National Statistical Office. A pin-art array device providing pseudo-haptic feedback was placed so that the subject receives a realistic feel during the peg-in-hole task, whose three-dimensional position varies with the experimental conditions. As shown at the bottom of Fig. 1, the subject, wearing an HMD, is asked to peg a bar-shaped key correctly into a number of presented holes. The experimental environment was constructed with an Oculus Rift CV1 and three IR camera sensors, and the imagery of the experimental content was scaled to 98%, 100%, and 102% by controlling the world-to-meters parameter of UnrealEngine version 4.15.
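To make the manipulation concrete: UnrealEngine's world-to-meters value is the number of engine units that represent one real meter (the default is 100), so the 98%, 100%, and 102% conditions correspond to values of 98, 100, and 102. The minimal standalone sketch below shows how each condition rescales the tracked 120 cm keyhole distance; the engine access path named in the comment is the usual one, not code from this study.

```cpp
#include <cstdio>

int main() {
    // World-to-meters is the number of engine units per real meter
    // (UnrealEngine default: 100). In engine code it is exposed on the
    // world settings, e.g. GetWorld()->GetWorldSettings()->WorldToMeters.
    const double keyholeMeters = 1.20;                 // keyhole distance (Sect. 3.1)
    const double conditions[] = {98.0, 100.0, 102.0};  // 98%, 100%, 102% of default
    for (double wtm : conditions)
        std::printf("world-to-meters = %5.1f -> 1.20 m maps to %.1f engine units\n",
                    wtm, keyholeMeters * wtm);
    return 0;
}
```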

The goal of this experiment is to determine whether the accuracy of the interaction (peg-in-hole) changes when the GFOV (via UnrealEngine's world-to-meters parameter) is varied, thereby changing the user's perceived viewing distance. The subjects were 10 university students (7 male, 3 female) with prior HMD experience, tested in a within-subject design.

3.2 Results and Discussions

The ANOVA results showed a significant effect of the world-to-meters parameter on the subjects' interaction distance error, as shown in Table 1.

Table 1. ANOVA result (Left: HTC VIVE, Right: Oculus Rift CV1)
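For reference, the following is a minimal sketch of the one-way ANOVA computation underlying a result like Table 1, using hypothetical distance-error samples in place of the paper's data.

```cpp
#include <cstdio>
#include <vector>

// Mean of a sample.
static double mean(const std::vector<double>& v) {
    double s = 0.0;
    for (double x : v) s += x;
    return s / static_cast<double>(v.size());
}

int main() {
    // Hypothetical per-condition distance errors (cm); placeholders only.
    std::vector<std::vector<double>> groups = {
        {3.1, 2.8, 3.5, 2.9, 3.2},  // world-to-meters = 98
        {2.4, 2.1, 2.6, 2.2, 2.5},  // world-to-meters = 100
        {1.6, 1.9, 1.4, 1.8, 1.7}   // world-to-meters = 102
    };

    double grand = 0.0;
    int n = 0;
    for (const auto& g : groups)
        for (double x : g) { grand += x; ++n; }
    grand /= n;

    // Between-group (SSB) and within-group (SSW) sums of squares.
    double ssb = 0.0, ssw = 0.0;
    for (const auto& g : groups) {
        const double m = mean(g);
        ssb += static_cast<double>(g.size()) * (m - grand) * (m - grand);
        for (double x : g) ssw += (x - m) * (x - m);
    }
    const int dfb = static_cast<int>(groups.size()) - 1; // between-group df
    const int dfw = n - static_cast<int>(groups.size()); // within-group df
    const double F = (ssb / dfb) / (ssw / dfw);
    std::printf("F(%d, %d) = %.2f\n", dfb, dfw, F);
    // A p-value would come from the F distribution (e.g. via a statistics
    // library); it is omitted in this sketch.
    return 0;
}
```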

To apply the significance found in the ANOVA to the improvement of interaction performance, a regression analysis of distance error against the world-to-meters parameter was performed. Although the fit (R²) is somewhat low, a regression line can be obtained, as shown in Fig. 2.

Fig. 2. Regression analysis between distance error and the world-to-meters parameter.

For direct-manipulation interaction within near-body space in the HMD environment, the world-to-meters value that drives the distance error to zero was calculated inversely from the regression line. The result is about 106.8 for the HTC VIVE and about 107.5 for the Oculus Rift CV1. It therefore helps to over-map the world-to-meters parameter of UnrealEngine by about 7%. As shown in Fig. 3, applying these correction values improved user interaction performance by 83% at short distances (40–60 cm) across six verification experiments.
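A minimal sketch of the calculation described above, assuming per-trial pairs of (world-to-meters value, signed distance error): an ordinary least-squares line error = a·w + b is fitted, and the correction value is its zero crossing w* = −b/a. The sample values are placeholders that only illustrate the shape of the computation, not the paper's measurements.

```cpp
#include <cstdio>
#include <vector>

int main() {
    // Placeholder (world-to-meters, distance error in cm) observations.
    std::vector<double> w = {98, 98, 100, 100, 102, 102};
    std::vector<double> e = {3.0, 3.3, 2.3, 2.5, 1.6, 1.8};

    const int n = static_cast<int>(w.size());
    double sw = 0, se = 0, sww = 0, swe = 0;
    for (int i = 0; i < n; ++i) {
        sw  += w[i];
        se  += e[i];
        sww += w[i] * w[i];
        swe += w[i] * e[i];
    }
    // Ordinary least squares: error = a * w + b.
    const double a = (n * swe - sw * se) / (n * sww - sw * sw); // slope
    const double b = (se - a * sw) / n;                         // intercept
    const double wstar = -b / a; // world-to-meters where predicted error is zero
    std::printf("error = %.4f * w + %.2f -> zero-error w* = %.1f\n", a, b, wstar);
    return 0;
}
```

The placeholder values were chosen so that w* comes out near the reported range; in practice w* is read off the fit to the measured errors.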

Fig. 3. Comparison of interaction distance error before and after calibration.

4 Conclusions and Future Works

We propose a method of controlling the virtual camera parameters of the 3D engine so that a user wearing an HMD can perceive distance precisely and interact accurately when manipulating a virtual object registered to real space. Specifically, we propose over-mapping the world-to-meters parameter of UnrealEngine, whose effect is similar to that of the GFOV, to about 107% to compensate for the misperception of near-body space, and we observed increased interaction accuracy for more than 80% of users. However, these are initial results obtained with a limited group of subjects and limited hardware and software. The study therefore needs to be extended to the various control elements of different HMDs, tracking systems, and 3D visualization engines in future work.