Posters and Demos

  • YouTube Playlist for Posters
  • YouTube Playlist for Demos
  • Proceedings on ACM DL

Poster/Demo Awards

All posters and demos were considered together when determining the Best Poster/Demo award and Best Poster/Demo Honorable Mention. Paper awards are listed on the paper program page.

  • Best Poster/Demo: Substituting Teleportation Visualization for Collaborative Virtual Environments. Santawat Thanyadit, Parinya Punpongsanon, Thammathip Piumsomboon, Ting-Chuen Pong
  • Best Poster/Demo Honorable Mention: Mixed Reality Spatial Computing in a Remote Learning Classroom. John Akers, Joelle Zimmermann, Laura Trutoiu, Brian Schowengerdt, Ira Kemelmacher-Shlizerman

Friday, 30 October (12:00 pm – 1:00 pm), Eastern Time

[Poster-01] A First Pilot Study to Compare Virtual Group Meetings using Video Conferences and (Immersive) Virtual Reality
Frank Steinicke, Annika Meinecke, Nale Lehmann-Willenbrock

Abstract
Face-to-face communication has evolved as the most natural means of communication. However, virtual group meetings have received considerable attention as an alternative that allows multiple persons to communicate over distance, e.g., via video conferences or immersive virtual reality (VR) systems, but they incur numerous limitations and challenges. In particular, they often hinder spatial perception of full-body language, deictic relations, or eye-to-eye contact. The differences between video conferences and immersive VR meetings still remain poorly understood. We report on a pilot study in which we compared virtual group meetings using video conferences and VR meetings with and without head-mounted displays (HMDs). The results suggest that participants feel a higher sense of presence when using an immersive VR meeting, but only if an HMD is used. Usability of video conferences as well as immersive VR was acceptable, whereas non-immersive VR without an HMD was not.

[Poster-04] Being Part of the Swarm: Experiencing Human-Swarm Interaction with VR and Tangible Robots
Hala Khodr, Ulysse Ramage, Kevin Kim, Arzu Guneysu, Barbara Bruno, Pierre Dillenbourg

Abstract
A swarm is the coherent behavior that emerges ubiquitously from simple interaction rules between self-organized agents. Understanding swarms is of utmost importance in many disciplines and jobs, but hard to teach due to the elusive nature of the phenomenon, which requires observing events at different scales (i.e., from different perspectives) and understanding the links between them. In this article, we investigate the potential of combining a swarm of tangible, haptic-enabled robots with Virtual Reality to provide a user with multiple perspectives and interaction modalities on the swarm, ultimately aiming at supporting the learning of emergent behaviours. The framework we developed relies on Cellulo robots and the Oculus Quest and was preliminarily evaluated in a user study involving 15 participants. Results suggest that the framework effectively allows users to experience the interaction with the swarm under different perspectives and modalities.

[Poster-05] BUDI: Building Urban Designs Interactively. Can Spatial-Based Collaboration Be Seamless?
Xi Sun, Matthew Plaudis, Tianming Wei, Yvonne Coady

Abstract
BUDI (Building Urban Designs Interactively) is an integrated 3D visualization and remote collaboration platform for complex urban design tasks. Users with different backgrounds can remotely engage in the entire design cycle, improving the quality of the end result. In BUDI, a virtual environment was designed to seamlessly expand beyond a traditional two-dimensional surface into a fully immersive three-dimensional space. Clients on various devices connect with servers for different functionalities tailored for various user groups. A demonstration with a local urban planning use-case shows the costs and benefits of BUDI as a spatial-based collaborative platform. We consider the trade-offs encountered when trying to make the collaboration seamless. Specifically, we introduce the multi-dimensional data visualization and interactions the platform provides, and outline how users can interact with and analyze various aspects of urban design.

[Poster-07] Mixed Reality Spatial Computing in a Remote Learning Classroom
John Akers, Joelle Zimmermann, Laura Trutoiu, Brian Schowengerdt, Ira Kemelmacher-Shlizerman

Abstract
We present a case study on the use of mixed reality (MR) spatial computing in a fully remote classroom. We conducted a 10-week undergraduate class fully online, using a combination of traditional teleconferencing software and MR spatial computing (Magic Leap One headsets) with an avatar-mediated social interaction application (Spatial). The class culminated in a virtual poster session, using Spatial in MR to present project results, and we conducted a preliminary investigation of students' experiences via interviews and questionnaires. Students reported that they had a good experience using MR for the poster session and that they thought it provided advantages over 2D video conferencing. Particular advantages cited were a stronger sense that they were in the presence of other students and instructors, an improved ability to tell where others were directing their attention, and a better ability to share 3D project content and collaborate.

[Demo-01] Augmented Reality for Self-Organized Logistics in Distribution
Dragos Datcu, Jelmer Winkel

Abstract
We have developed an augmented reality (AR) based system that keeps track of events during the parcel handling process for last-mile logistics. The system can retrieve and highlight in AR, on the user's smartphone, the location of parcels in large piles of parcels. A camera array automatically detects the parcels placed manually on the shelf by an operator. New parcels are scanned and parcel fingerprints are generated semi-automatically. The system can detect and track the known parcels by fingerprint and can further highlight the location of a parcel using 3D visual cues, directly on the operator's smartphone.

[Demo-02] Interfacing with Sensory Options Using a Virtual Equipment System
Powen Yao, Vangelis Lympouridis, Tian Zhu, Michael Zyda

Abstract
We envision the development of a novel Virtual Equipment System to replace existing 2D interfaces in virtual reality with a set of embedded and distributed input devices. We have built a prototype that takes advantage of the user's spatial awareness, offering them a set of virtual equipment relevant to their sensory organs. It allows users to quickly access a suite of intuitively designed interfaces using their spatial and body awareness. Our Virtual Equipment System can be standardized and applied to other extended reality devices and frameworks.

[Demo-03] Punch Typing: Alternative Method for Text Entry in Virtual Reality
Powen Yao, Vangelis Lympouridis, Tian Zhu, Michael Zyda, Ruoxi Jia

Abstract
A common way to perform data entry in virtual reality remains the use of virtual laser pointers to select characters from a flat 2D keyboard in 3-dimensional space. In this demo, we present a data input method that takes advantage of 3D space by interacting with a keyboard whose keys are arranged in three dimensions. Each hand is covered by a hemisphere of keys based on the QWERTY layout, allowing users to type by moving their hands in a motion similar to punching. Although the goal is to achieve a gesture more akin to tapping, current controllers and hand-tracking technology do not allow such high fidelity. Thus, the presented interaction using VR controllers is more comparable to punching.

[Demo-04] Tangible VR: Traversing Space in XR to Grow a Virtual Butterfly
Jiaqi Zhang, Brenda Lopez Silva

Abstract
Immersive reality technologies have been widely utilized in the area of cultural heritage, also known as Virtual Heritage. We present a tangible Virtual Reality (VR) interaction demo that allows users to freely walk in the physical space while engaging with digital and tangible objects in a "learning area". The space setup includes stations that are used symbiotically in the virtual and physical environments; this setup ensures consistency throughout the experience. With this method, we enhance the immersive learning experience by mapping the large virtual space onto a smaller physical place with a seamless transition.

Sunday, 1 November (8:30 am – 9:30 am), Eastern Time

[Poster-02] A Framework for Scalable Tracking of Physical Objects to Enhance Immersive Prototyping for Design Thinking
Pak Ming Fan, Ting-Chuen Pong

Abstract
Design thinking is regarded as one of the important problem-solving skills and includes a prototyping stage for early testing of the feasibility of an idea. With the advancement in technologies, virtual reality is being applied to create immersive experiences in design prototyping. While tracking physical objects and visualizing the tracking in the virtual world could enhance the immersive experience, existing tracking devices are not easy to set up and might not scale cost-effectively in multi-user scenarios. The aim of this study is, therefore, to propose a framework that uses simple and low-cost hardware for tracking physical objects in a multi-user environment. A sample implementation, along with demonstrations of the tracking, is also presented.

[Poster-03] Augmented Reality Could Transform Last-Mile Logistics
Jelmer Winkel, Dragos Datcu, Paul Buijs

Abstract
The rapid growth of e-commerce is challenging parcel delivery companies to meet the demand while keeping their costs low. While drivers have to meet unrealistic targets, the companies deal with a high staff turnover rate for the last mile. Consequently, with every turnover, the critical experience for the last mile is lost. In this paper, we introduce an augmented reality based system that keeps track of events during the parcel handling process for last-mile logistics. The system can register and track parcels inside the truck and project their location to the driver in augmented form. The system, which we have integrated into a mobile application, potentially reduces processing times and accelerates the training of new drivers. We discuss first-hand observations and propose further research areas. Field tests are currently under development to verify whether the system can operate in a real-life environment.

[Poster-06] Investigating the Effectiveness of Locked Dwell Time-based Point and Tap Gesture for Selection of Nail-sized Objects in Dense Virtual Environment
Shimmila Bhowmick, Ayaskant Panigrahi, Pranjal Borah, Pratul Kalita, Keyur Sorathia

Abstract
In immersive VR environments, object selection is an essential interaction. However, current object selection techniques suffer from issues of hand jitter, accuracy, and fatigue, especially when selecting nail-sized objects. Here, we present locked dwell time-based point and tap, a novel object selection technique designed for nail-sized object selection in a dense virtual environment. The objects are within arm's reach. We also compare locked dwell time-based point and tap with magnetic grasp, pinch, and raycasting. 40 participants evaluated the effectiveness and efficiency of these techniques. The results show that locked dwell time-based point and tap yielded significantly lower task completion time and error rate. It was also the most preferred and caused the least effort among all the techniques. We also measured ease of use, ease of learning, and perceived naturalness of the technique.

[Poster-08] Spatial Scale Perception for Design Tasks in Virtual Reality
Jingjing Zhang, Ze Dong, Robert Lindeman, Thammathip Piumsomboon

Abstract
We present a user study exploring spatial scale perception for design tasks by simulating different levels of eye heights (EHs) and inter-pupillary distances (IPDs) in a virtual environment. The study examined two levels of spatial scale perception of two-year-old children and adults for a chair scale estimation task. This was to provide appropriate perspectives to enable a suitable estimation of the virtual object scale for the target group during the design process. We found that the disparity between the perspective taken and the target user group had a significant impact on the resulting scale of the chairs. Our key contribution is in providing evidence to support that experiencing different spatial scale perception in VR has the potential to improve the designer's understanding of the end-user's perspective during the design process.

[Poster-09] Substituting Teleportation Visualization for Collaborative Virtual Environments
Santawat Thanyadit, Parinya Punpongsanon, Thammathip Piumsomboon, Ting-Chuen Pong

Abstract
Virtual Reality (VR) offers a boundless space for users to create, express, and explore, free of the limitations of the physical world. Teleportation is a locomotion technique that overcomes our spatial constraints and is a common approach for travel in VR applications. However, in a multi-user virtual environment, teleportation causes spatial discontinuity of a user's location in space. This may cause confusion and difficulty in tracking one's collaborator, who keeps disappearing and reappearing around the environment. To reduce the impact of this issue, we have identified the requirements for designing substituted visualizations (SVs) and present four SVs of the collaborator during teleportation: hover, jump, fade, and portal.

[Poster-10] Surface vs Motion Gestures for Mobile Augmented Reality
Ze Dong, Jingjing Zhang, Robert Lindeman, Thammathip Piumsomboon

Abstract
Surface gestures have been the dominant form of interaction for mobile devices with touchscreens. From our survey of current consumer mobile Augmented Reality (AR) applications, we found that these applications also adopt the surface gesture metaphor as their interaction technique. However, we believe that designers should be able to utilize the affordance of the three-dimensional interaction space that AR provides. We compared surface and motion gesture interaction techniques in a Pokémon GO-like mobile AR game that we developed. Ten participants experienced both techniques to capture three sizes of Pokémon. The comparison examined the two types of gestures in terms of accuracy, game experience, and subjective ratings of goodness, ease of use, and engagement. It was found that the motion gesture provided better engagement and game experience, while the surface gesture was more accurate and easier to use.

[Poster-11] Temporal Manipulation Interface of Motion Data for Movement Observation in a Personal Training
Natsuki Hamanishi, Jun Rekimoto

Abstract
In this paper, we propose an observation method that makes it easy to distinguish the temporal changes of three-dimensional (3D) motions, together with its temporal manipulation interface. Conventional motion observation methods have several limitations when observing 3D motion data. A Direct Manipulation (DM) interface is suitable for observing the temporal features of videos. Besides, it is suitable for daily use because it does not require learning any special operations. Our aim is to introduce DM into 3D motion observation without losing these advantages, by mapping temporal changes onto a specific vector in real-world 3D space.

[Poster-12] Towards a Specification Language for Spatial User Interaction
Khadidja Chaoui, Sabrina Bouzidi-Hassini, Yacine Bellik

Abstract
Spatial interactions have great potential in ubiquitous environments. Physical objects, endowed with interconnected sensors, cooperate in a transparent manner to help users in their daily tasks. In our context, we qualify an interaction as spatial if it results from considering spatial attributes (location, orientation, speed...) of the user's body or of a given object used by them. According to our literature review, we found that despite their benefits (simplicity, concision, naturalness...), spatial interactions are not as widespread as other interaction models such as graphical or tactile ones. We think that this is due to the lack of software tools and frameworks that can make the design and development of spatial interaction easy and fast. In this paper, we propose a spatial interaction modeling language named SUIL (Spatial User Interaction Language), which represents the first step towards the development of such tools.

Posters, Demos and Interaction Chairs

poster@sui.acm.org

Kangsoo Kim (University of Central Florida, USA)

Isaac Cho (North Carolina A&T State University, USA)

Tham Piumsomboon (University of Canterbury, Christchurch, New Zealand)