[Poster-02] A framework for scalable tracking of physical objects to enhance immersive prototyping for design thinking
Pak Ming Fan, Ting-Chuen Pong
Abstract
Design thinking is regarded as an important problem-solving skill and includes a prototyping stage for testing the feasibility of an idea early. With the advancement of technology, virtual reality is being applied to create immersive experiences in design prototyping. While tracking physical objects and visualizing the tracking in the virtual world could enhance these immersive experiences, existing tracking devices are not easy to set up and might not scale cost-effectively in multi-user scenarios. The aim of this study is, therefore, to propose a framework that uses simple, low-cost hardware for tracking physical objects in a multi-user environment. A sample implementation along with demonstrations of the tracking is also presented.
|
[Poster-03] Augmented Reality could transform last-mile logistics
Jelmer Winkel, Dragos Datcu, Paul Buijs
Abstract
The rapid growth of e-commerce is challenging parcel delivery companies to meet demand while keeping their costs low. While drivers have to meet unrealistic targets, the companies deal with a high staff turnover rate in last-mile delivery. Consequently, with every turnover, critical last-mile experience is lost. In this paper, we introduce an augmented reality-based system that keeps track of events during the parcel handling process for last-mile logistics. The system can register and track parcels inside the truck and project their locations to the driver in augmented form. The system, which we have integrated into a mobile application, potentially reduces processing times and accelerates the training of new drivers. We discuss first-hand observations and propose further research areas. Field tests are currently under development to verify whether the system can operate in a real-life environment.
|
[Poster-06] Investigating the Effectiveness of Locked Dwell Time-based Point and Tap Gesture for Selection of Nail-sized Objects in Dense Virtual Environment
Shimmila Bhowmick, Ayaskant Panigrahi, Pranjal Borah, Pratul Kalita, Keyur Sorathia
Abstract
In immersive VR environments, object selection is an essential interaction. However, current object selection techniques suffer from issues of hand jitter, accuracy, and fatigue, especially when selecting nail-sized objects. Here, we present locked dwell time-based point and tap, a novel object selection technique designed for selecting nail-sized objects within arm's reach in a dense virtual environment. We compare locked dwell time-based point and tap with magnetic grasp, pinch, and raycasting. 40 participants evaluated the effectiveness and efficiency of these techniques. The results showed that locked dwell time-based point and tap yielded a significantly lower task completion time and error rate. It was also the most preferred and required the least effort among all the techniques. We also measured ease of use, ease of learning, and the perceived naturalness of the technique.
|
[Poster-08] Spatial Scale Perception for Design Tasks in Virtual Reality
Jingjing Zhang, Ze Dong, Robert Lindeman, Thammathip Piumsomboon
Abstract
We present a user study exploring spatial scale perception for design tasks by simulating different levels of eye height (EH) and inter-pupillary distance (IPD) in a virtual environment. The study examined the spatial scale perception of two target groups, two-year-old children and adults, in a chair scale estimation task. The goal was to provide appropriate perspectives that enable suitable estimation of virtual object scale for the target group during the design process. We found that the disparity between the perspective taken and the target user group had a significant impact on the resulting scale of the chairs. Our key contribution is providing evidence that experiencing different spatial scale perceptions in VR has the potential to improve the designer's understanding of the end-user's perspective during the design process.
|
[Poster-09] Substituting Teleportation Visualization for Collaborative Virtual Environments
Santawat Thanyadit, Parinya Punpongsanon, Thammathip Piumsomboon, Ting-Chuen Pong
Abstract
Virtual Reality (VR) offers a boundless space for users to create, express, and explore, free from the limitations of the physical world. Teleportation is a locomotion technique that overcomes spatial constraints in a virtual environment and is a common approach to travel in VR applications. However, in a multi-user virtual environment, teleportation causes spatial discontinuity of a user's location in space. This may cause confusion and difficulty in tracking one's collaborator, who keeps disappearing and reappearing around the environment. To reduce the impact of this issue, we have identified the requirements for designing substituted visualizations (SVs) and present four SVs of the collaborator during the process of teleportation: hover, jump, fade, and portal.
|
[Poster-10] Surface vs Motion Gestures for Mobile Augmented Reality
Ze Dong, Jingjing Zhang, Robert Lindeman, Thammathip Piumsomboon
Abstract
Surface gestures have been the dominant form of interaction for touchscreen mobile devices. From our survey of current consumer mobile Augmented Reality (AR) applications, we found that these applications have also adopted the surface gesture metaphor as their interaction technique. However, we believe that designers should be able to utilize the affordance of the three-dimensional interaction space that AR provides. We compared two interaction techniques, surface and motion gestures, in a Pokémon GO-like mobile AR game that we developed. Ten participants experienced both techniques to capture three sizes of Pokémon. The comparison examined the two types of gestures in terms of accuracy, game experience, and subjective ratings of goodness, ease of use, and engagement. We found that the motion gesture provided better engagement and game experience, while the surface gesture was more accurate and easier to use.
|
[Poster-11] Temporal Manipulation Interface of Motion Data for Movement Observation in a Personal Training
Natsuki Hamanishi, Jun Rekimoto
Abstract
In this paper, we propose an observation method to easily distinguish the temporal changes of three-dimensional (3D) motions, along with its temporal manipulation interface. Conventional motion observation methods have several limitations when observing 3D motion data. A Direct Manipulation (DM) interface is well suited to observing the temporal features of videos. Moreover, it is suitable for daily use because it does not require learning any special operations. Our aim is to introduce DM into 3D motion observation without losing these advantages, by mapping temporal changes onto a specific vector in real-world 3D space.
|
[Poster-12] Towards a Specification Language for Spatial User Interaction
Khadidja Chaoui, Sabrina Bouzidi-Hassini, Yacine Bellik
Abstract
Spatial interactions have great potential in ubiquitous environments. Physical objects, endowed with interconnected sensors, cooperate in a transparent manner to help users in their daily tasks. In our context, we qualify an interaction as spatial if it results from considering spatial attributes (location, orientation, speed...) of the user's body or of a given object used by him or her. From our literature review, we found that despite their benefits (simplicity, concision, naturalness...), spatial interactions are not as widespread as other interaction models such as graphical or tactile ones. We believe this is due to the lack of software tools and frameworks that can make the design and development of spatial interactions easy and fast. In this paper, we propose a spatial interaction modeling language named SUIL (Spatial User Interaction Language), which represents the first step towards the development of such tools.
|