ICAT-EGVE2019 - Posters and Demos
Browsing ICAT-EGVE2019 - Posters and Demos by Subject "Human"
Now showing 1 - 15 of 15
Item Airflow Presentation Method for Turning Motion Feedback in VR Environment (The Eurographics Association, 2019)
Suzuki, Yujin; Yem, Vibol; Hirota, Koichi; Amemiya, Tomohiro; Kitazaki, Michiteru; Ikei, Yasushi; Kakehi, Yasuaki and Hiyama, Atsushi
This paper describes the effectiveness of airflow presentation in reducing VR sickness and inducing vection during turning motion. Five airflow displays were placed around the face at 45° intervals, each 0.6 m from the face. Two directions of turning motion, left and right, were visually presented. The results showed that airflow from any direction could reduce VR sickness. Moreover, we confirmed that airflow presented at 45° from the front toward the turning direction (left or right) enhanced the perception of vection.

Item Augmented Dodgeball AR Viewer for Spectators (The Eurographics Association, 2019)
Azuma, Shota; Hertzog, Clara; Sakurai, Sho; Hirota, Koichi; Nojima, Takuya; Kakehi, Yasuaki and Hiyama, Atsushi
In recent years, many systems and methods have been developed to provide spectators with information about sports games such as baseball, basketball, and soccer. Among them, Augmented Sport is an emerging area that aims to merge video game concepts into physical sports. This project focuses on merging game elements such as Health Points (HP), Attack Power (AP), and Defense Power (DP) to improve enjoyment and variety for players. During Augmented Dodgeball games, spectators can visualize additional parameters such as the HP, AP, and DP of each player through a Mixed Reality device. These data are superimposed onto each physical player by means of AR markers they wear. To avoid marker occlusion issues, fixed cameras are also used to acquire the players' physical information and share it via a database. Studies have been conducted to find the best display design, methods, and limits of the system.

Item Difference of the Sense of the Texture Between Visual and Touchable Cloth Object in VR Space (The Eurographics Association, 2019)
Tsukuda, Yoshio; Koeda, Masanao; Kakehi, Yasuaki and Hiyama, Atsushi
In a virtual reality (VR) space, the texture of cloth is perceived visually through three-dimensional computer graphics (3DCG) representing the surface material and movement of the cloth object. In recent years, research has been conducted on various dynamics-based cloth simulations in 3DCG. However, the relation between the mechanical parameters of the generated cloth and its perceived texture has not been clarified; instead, it is implemented based on the intuition of the 3DCG developer. In this study, we examine the differences in the sense of texture between visually observing cloth objects and actively touching them in a VR space, while changing the mechanical parameters of the cloth. An experiment with 10 subjects using the semantic differential method showed that, with visual presentation alone, the subjects did not perceive a clear texture for most adjective pairs. The subjects perceived the texture more clearly by actively touching the cloth, even with no haptic feedback. It was also revealed that, for some mechanical parameters, the sense of texture can be reversed between visual-only observation and active touch of the cloth.

Item Expanding the Freedom of Eye-gaze Input Interface using Round-Trip Eye Movement under HMD Environment (The Eurographics Association, 2019)
Matsuno, Shogo; Sato, Hironobu; Abe, Kiyohiko; Ohyama, Minoru; Kakehi, Yasuaki and Hiyama, Atsushi
In this paper, we propose a gaze-movement detection algorithm needed to implement a gaze-movement input interface using an HMD's built-in eye-tracking system. Most input methods used in current virtual reality and augmented reality are hand-held devices, hand gestures, head tracking, and voice input, even though the HMD is worn on the head. To use eye behavior as a hands-free input modality, we consider a gaze input interface that does not depend on the measurement accuracy of the eye tracker. The proposed method assumes eye-movement input, as distinct from the gaze-position input usually implemented with an eye-tracking system. Specifically, by using round-trip eye movement in an oblique direction as an input channel, it aims to realize an input method that neither blocks the view with an on-screen display nor hinders the acquisition of other gaze meta-information. The proposed algorithm was implemented on an HMD, and the detection accuracy of the round-trip eye movement was evaluated experimentally. Averaged over five subjects, the detection accuracy was 90%. The results show that the method is accurate enough to build an input interface based on eye movement. (A minimal detection sketch, under stated assumptions, follows this listing.)

Item Footstep Sound for Suppression of VR Sickness and Promotion of Sense of Agency (The Eurographics Association, 2019)
Nashiki, Reon; Yem, Vibol; Amemiya, Tomohiro; Ikei, Yasushi; Kakehi, Yasuaki and Hiyama, Atsushi
Providing realistic bodily sensation in a virtual reality (VR) space is crucial for the natural integration of the multisensory information we receive in real space. In this paper, we consider using auditory stimuli to enhance bodily sensation as an indirect representation of the body in the VR space. Three levels of visually presented virtual locomotion using a head-mounted display (HMD) and four levels of footstep sound stimulus were evaluated with respect to VR sickness and the sense of agency. The results showed that the footstep sound decreased both VR sickness and the discomfort of the visual presentation of moving down a virtual corridor when the sound was synchronized with the visual stimulus. The sense of agency was also increased by synchronized footstep sound presentation.

Item Generation of Walking Sensation by Upper Limb Motion (The Eurographics Association, 2019)
Sueta, Gaku; Saka, Naoyuki; Yem, Vibol; Amemiya, Tomohiro; Kitazaki, Michiteru; Sato, Makoto; Ikei, Yasushi; Kakehi, Yasuaki and Hiyama, Atsushi
This paper proposes a method to generate a turning-walk sensation in the user with an arm-swing display. We hypothesized that the turning-walk sensation is generated by providing different passive arm-swing motion profiles to the left and right arms. We show that the turning-walk sensation can be generated by presenting arm-swing motion with a different flexion ratio of the shoulder joint, depending on the turning radius.

Item Narrowcasting for Stereoscopic Photospherical Cinemagraphy (The Eurographics Association, 2019)
Cohen, Michael; Iida, Takato; Sato, Rintaro; Kakehi, Yasuaki and Hiyama, Atsushi
We have developed an application that blurs the distinction between static and dynamic imagery in a stereoscopic omnidirectional browser. A "cinemagraph" is a living picture, interpolating between a still photo and a video. A stereo omnidirectional camera can capture stereographic contents. Combining such functionality yields a photospherical cinemagraph. Runtime control of activation fields allows selective alternation between frozen and animated scene elements. Narrowcasting, a user interface idiom for selective activation, is used to alternate between static and moving imagery. Presentation includes stereoscopic display (binocular channels) and spatial sound.

Item Preliminary Study on Surface Texture to Manipulate Perceived Softness of 3D Printed Objects (The Eurographics Association, 2019)
Miyoshi, Motoki; Punpongsanon, Parinya; Iwai, Daisuke; Sato, Kosuke; Kakehi, Yasuaki and Hiyama, Atsushi
Previous studies have attempted to manipulate the elastic properties of products through elements such as different materials and internal structures. In this paper, we investigate whether the softness perceived from a surface texture can be manipulated when using an FDM 3D printer. In a subject experiment, we investigated the perceived softness of the surface texture proposed by Tymms et al., in which cones 1 mm in height are arranged on the surface. The results showed that perceived hardness decreased as the spacing between the cones increased, and that subjects perceived objects with the surface texture as softer.

Item Production of Instructional Videos Using a Virtual Presentation Room on a Mobile Head-mounted Display (The Eurographics Association, 2019)
Yano, Kojiro; Kakehi, Yasuaki and Hiyama, Atsushi
Instructional videos are frequently used in education, and it is important for teachers to produce such materials with minimal time and effort. In this article, I report SlidesGo, "virtual presentation room" software for a mobile head-mounted display. It allows a user to record a slide-presentation video with an avatar in a virtual environment, to record head orientations during the presentation, and to produce a heatmap characterizing the speaker's visual attention during the recording. Instructional video production using a mobile head-mounted display is far more time- and cost-efficient than production in a physical studio and gives more insight into the teacher's non-verbal behaviour.

Item System for Body Motion Capture While Moving in Large Area (The Eurographics Association, 2019)
Yuasa, Yusuke; Tamura, Hideki; Yem, Vibol; Amemiya, Tomohiro; Kitazaki, Michiteru; Ikei, Yasushi; Kakehi, Yasuaki and Hiyama, Atsushi
Motion capture devices such as OptiTrack have been used actively; however, they are suited to measurement in a narrow space. We propose a system to measure body motion while walking in a large area. In the system, we attached an OptiTrack camera, a depth sensor, and a 9-axis sensor to a mobile vehicle. This paper reports the potential of our system.

Item Vehicle-Ride Sensation Sharing for Immersive Remote Collaboration with Vestibular Haptic Chair to reduce VR Sickness (The Eurographics Association, 2019)
Morita, Tsubasa; Yem, Vibol; Amemiya, Tomohiro; Ikei, Yasushi; Kakehi, Yasuaki and Hiyama, Atsushi
We propose a telepresence system that presents vehicle-ride sensation in real time for remote collaborative tasks. We used a Segway, a personal vehicle, for the local driver, and a rotary chair with vestibular haptic feedback for an expert who attends the task remotely. The telepresence system will enable an expert to collaborate remotely with a local driver on a highly professional local surveillance task. We conducted a preliminary test of the feedback system design using a rotary seat built for the evaluation. The results showed that the participants adjusted the angular acceleration of the rotary seat to about half the angular acceleration of the camera in motion. The seat rotation needed to be in phase with the rotation of the camera to reduce VR sickness.

Item Visual Presentation For Sports Skill Learning in VR (The Eurographics Association, 2019)
Miyashita, Fumiya; Amemiya, Tomohiro; Kitazaki, Michiteru; Kasamatsu, Keiko; Yem, Vibol; Ikei, Yasushi; Kakehi, Yasuaki and Hiyama, Atsushi
This paper examines the viewpoint suitable for sports training in virtual reality (VR). We compared first-person and third-person views in terms of the accuracy of cognitive simulation and reproduction of body-part trajectories. From the third-person view, participants were able to understand 66% of the whole body's movement; from the first-person view, they were able to understand 52%. However, when observing complex movement, such as grasping the position of a forearm, the third-person view enabled memorization of the position significantly better than the first-person view. This suggests that the viewpoint needs to be changed depending on the features of the sport.

Item Visual Search of Interactive Gaze in a Virtual Environment: Detecting Eye Contact is Faster than Gaze Averting (The Eurographics Association, 2019)
Yamamoto, Kyosuke; Kitazaki, Michiteru; Kakehi, Yasuaki and Hiyama, Atsushi
We often gaze at each other when communicating intimately with others. A previous study with static stimuli revealed that the perception of others' gaze is asymmetric only when the head is deviated: a gazing-face target is found faster than an averting-gaze target. However, gaze research has been limited to static eye stimuli, and the perceptual processing of dynamic gaze has not yet been investigated. Therefore, we created dynamic, interactive gaze stimuli of frontal heads in a virtual environment using a head-mounted display that can measure eye movements, and conducted visual search experiments. We found that a gaze-contacting target presented among gaze-averting distractors was detected faster than a gaze-averting target among gaze-directing distractors. Thus, detecting eye contact is faster than detecting gaze averting, even with frontal faces in dynamic environments, suggesting that dynamic eye contact has special value for human perception.

Item VR Sickness Reduction in Stereoscopic Video Streaming System 'TwinCam' for a Remote Experience (The Eurographics Association, 2019)
Yagi, Ryunosuke; Fujie, Toi; Amemiya, Tomohiro; Kitazaki, Michiteru; Yem, Vibol; Ikei, Yasushi; Kakehi, Yasuaki and Hiyama, Atsushi
In this paper, we discuss a method to present remote stereoscopic vision with reduced VR sickness. Our omnidirectional stereoscopic video streaming system (TwinCam) is described, and the merits of its design are introduced. One of its important features is VR sickness reduction, which we evaluated using the simulator sickness questionnaire in comparison with a conventional parallel-camera design. The results revealed that TwinCam suppressed VR sickness significantly compared with the conventional parallel cameras, to the same level as a fixed monocular camera.

Item Wider IPD Makes People Perceive Their Body to be not so Large when Large Hands are Presented (The Eurographics Association, 2019)
Mine, Daisuke; Ogawa, Nami; Narumi, Takuji; Yokosawa, Kazuhiko; Kakehi, Yasuaki and Hiyama, Atsushi
It is known that hand size and interpupillary distance (IPD), as well as eye height from the ground, are among the determinants of perceived body size. We investigated the effect of simultaneous changes in hand size and IPD on size perception of the body and the external world in virtual reality. We manipulated hand size and IPD (normal hand and normal IPD, large hand and normal IPD, large hand and large IPD) while vertically increasing the participants' eye height. Our main results indicated that a wider IPD combined with larger hands made participants perceive their body to be smaller than when the IPD was normal. Thus, the IPD influences the perception of body size when large hands are presented. This is a novel result because it suggests the possibility of an interaction effect between the IPD and hand size, or between the IPD and the presence of hands, on body size perception.
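
As a concrete illustration of the round-trip eye-movement input described in "Expanding the Freedom of Eye-gaze Input Interface using Round-Trip Eye Movement under HMD Environment" above, the following is a minimal sketch of how such a gesture might be detected from a stream of gaze samples. This is not the authors' algorithm: the data format (normalized gaze coordinates), the oblique axis angle, and the amplitude/return thresholds are assumptions made purely for illustration.

# Hypothetical sketch: detect a "round-trip" eye movement along an oblique axis
# from a stream of normalized gaze samples (x, y in [-1, 1]). Thresholds, axis,
# and input format are assumed, not taken from the paper.

import math

def detect_round_trip(samples, axis_deg=45.0, amplitude=0.5, tolerance=0.15):
    """Return True if the gaze moves out along the oblique axis beyond
    `amplitude` and then returns to within `tolerance` of the start point."""
    if not samples:
        return False
    # Unit vector of the oblique detection axis.
    ax = (math.cos(math.radians(axis_deg)), math.sin(math.radians(axis_deg)))
    x0, y0 = samples[0]
    went_out = False
    for x, y in samples[1:]:
        # Signed displacement of the gaze point along the oblique axis.
        d = (x - x0) * ax[0] + (y - y0) * ax[1]
        if d > amplitude:
            went_out = True                 # outbound phase completed
        elif went_out and abs(d) < tolerance:
            return True                     # gaze came back: round trip detected
    return False

# Toy usage: the gaze drifts out along the 45-degree diagonal and returns.
trace = [(0.0, 0.0), (0.2, 0.2), (0.4, 0.4), (0.5, 0.5), (0.2, 0.2), (0.02, 0.01)]
print(detect_round_trip(trace))  # True

In the toy trace, the projection onto the 45° axis exceeds the outbound threshold at (0.5, 0.5) and later falls back near zero, so the round trip is reported; in practice, the thresholds would have to be tuned to the HMD's eye-tracking noise, which is exactly the sensitivity the paper's approach aims to minimize.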