Mapping out Work in a Mixed Reality Project Room
Derek Reilly, Andy Echenique, Andy Wu, Anthony Tang, W. Keith Edwards

CHI ’15: ACM Conference on Human Factors in Computing Systems
Session: Collaborative Tables, Walls & Rooms

Abstract
We present results from a study examining how the physical layout of a project room and task affect the cognitive maps acquired of a connected virtual environment during mixed-presence collaboration. Results indicate that a combination of physical layout and task impacts cognitive maps of the virtual space. Participants did not form a strong model of how different physical work regions were situated relative to each other in the virtual world when the tasks performed in each region differed. Egocentric perspectives of multiple displays enforced by different furniture arrangements encouraged cognitive maps of the virtual world that reflected these perspectives, when the displays were used for the same task. These influences competed or coincided with document-based, audiovisual and interface cues, influencing collaboration. We consider the implications of our findings on WYSIWIS mappings between real and virtual for mixed-presence collaboration.

DOI: http://dx.doi.org/10.1145/2702123.2702506
WEB: https://chi2015.acm.org/

Recorded at the 33rd Annual ACM Conference on Human Factors in Computing Systems in Seoul, Korea, April 18-23, 2015

Vuforia for Mixed Reality and Eyewear

Mixed reality experiences combine AR and VR to deliver increased immersion and visualization power. Vuforia’s Mixed Reality controller provides seamless AR/VR transitions for handheld or head-worn experiences. Built-in support for most viewers, including inertial tracking and stereo rendering, means there is no need for a separate VR SDK. Vuforia also supports AR glasses such as the Epson Moverio BT-200, ODG R-6, and ODG R-7.
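
As a rough idea of how such an AR/VR transition is driven from application code, here is a minimal Unity C# sketch based on the MixedRealityController API as it appeared in Vuforia 6.x-era samples; the exact class, method, and mode names are assumptions and may differ in other SDK versions.

using UnityEngine;
using Vuforia;

// Minimal sketch: switch a viewer-based experience between AR (camera see-through)
// and VR (fully virtual) presentation at runtime.
public class ViewerModeSwitcher : MonoBehaviour
{
    public void EnterAR()
    {
        // Show the camera feed with augmentations registered to the real world.
        MixedRealityController.Instance.SetMode(MixedRealityController.Mode.VIEWER_AR);
    }

    public void EnterVR()
    {
        // Replace the camera feed with a fully rendered scene, keeping head tracking.
        MixedRealityController.Instance.SetMode(MixedRealityController.Mode.VIEWER_VR);
    }
}

For a handheld (non-viewer) experience, the corresponding handheld AR/VR modes would be used instead.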

Virtual Objects as Spatial Cues in Collaborative Mixed Reality Environments: How They Shape Communication Behavior and User Task Load
Jens Müller, Roman Rädle, Harald Reiterer

Abstract:
In collaborative activities, collaborators can use physical objects in their shared environment as spatial cues to guide each other’s attention. Collaborative mixed reality environments (MREs) include both physical and digital objects. To study how virtual objects influence collaboration and whether they are used as spatial cues, we conducted a controlled lab experiment with 16 dyads. Results of our study show that collaborators favored the digital objects as spatial cues over the physical environment and the physical objects: collaborators used significantly fewer deictic gestures in favor of more disambiguous verbal references, and reported a decreased subjective workload when virtual objects were present. This suggests adding virtual objects as spatial cues to MREs to improve user experience during collaborative mixed reality tasks.

ACM DL: http://dl.acm.org/citation.cfm?id=2858043
DOI: http://dx.doi.org/10.1145/2858036.2858043

——

https://chi2016.acm.org/wp/

AI, Mixed Reality and the New Software Design Landscape

*UPDATE*

Based on the feedback and requests after this talk was given at Microsoft in January 2016, I have decided to create a series of “Creative Coding in Unity” workshops, which will be posted on this video channel and hosted on my blog: http://www.rbarraza.com

An age of more personal computing is upon us, but is our intuition around technology and design up to the challenge? In this sweeping talk across the new software design landscape, Rick Barraza explains how fundamental shifts in computation, artificial intelligence and mixed reality are transforming everything we thought we knew about design and the future of software.

Mixed Reality and the future of holograms

What does a world of mixed reality mean to you? Microsoft HoloLens came to Cannes Lions 2015 to talk about holograms and about breaking digital assets away from the screen and putting them into the real world.

Mixed Reality Study: Usage of VR and AR for Medical Training

Article here: http://realvision.ae/blog/2016/04/mixed-reality-ar-vr-holograms-medical/

This video is part of a research project for a client. With talk of “holograms” becoming widespread thanks to devices such as the HoloLens, there is interest from diverse sectors such as medicine, education, and industry in the use of mixed reality solutions for their visualization and training needs.

Some technical details:
This AR experience was created to stress-test the feasibility of a mobile, self-contained, untethered solution for visualization and training.

The project was created in Unity and runs on a Samsung Gear VR with a Galaxy S6 Edge phone.
The slight lag is due to the very high polygon count and textures running on the S6, along with MirrorOp (a screen-recording solution for Android) running in the background on the same phone and transmitting the S6’s screen to a PC in the next room over Wi-Fi. The picture-in-picture window above is that recording.

Outcomes so far:
– With more optimized use of 3D models and textures, a more robust solution can be realized.
– The S6 fires a thermal-throttling warning, due to the combined demands of the running camera and high-polygon-count objects.
– Fiducial trackers/markers should completely eliminate tracking jitter and would be recommended in a medical scenario over more cosmetic/fancy image markers, which might be acceptable for the advertising market (a minimal marker-tracking sketch follows this list).
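
As a rough illustration of the marker-based approach recommended above, the sketch below follows the trackable event handler pattern from the Vuforia Unity SDK of that era (circa 6.x), showing the training model only while the marker is tracked. Class and method names reflect that SDK generation, and the model reference is a hypothetical placeholder for the actual training content.

using UnityEngine;
using Vuforia;

// Attached to an ImageTarget (or fiducial marker) GameObject: shows the
// training model only while the marker is tracked, hiding it on tracking loss.
public class MarkerVisibilityHandler : MonoBehaviour, ITrackableEventHandler
{
    public GameObject model;   // hypothetical reference to the 3D training model

    private TrackableBehaviour trackable;

    void Start()
    {
        trackable = GetComponent<TrackableBehaviour>();
        if (trackable != null)
            trackable.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        // Treat detected, tracked, and extended-tracked states as "visible".
        bool tracked = newStatus == TrackableBehaviour.Status.DETECTED ||
                       newStatus == TrackableBehaviour.Status.TRACKED ||
                       newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;
        if (model != null)
            model.SetActive(tracked);
    }
}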