Mixing Realities for Collaborative Play

Six trends in Collaboration: Mixed Reality Production Design & Development

At conferences and events I recently asked a number of virtual engagement pioneers and media producers to define mixed reality (MR) for their clients and students. The definitions are as diverse and funny as the people offering them in their videos, photos and quotes.

Six trends emerged as keys to deeper collaboration in MR: designing for social, live, integrated interactions with shared spatial engagement that creates multimodal play for wider accessibility.

Definitions of reality are always varied and fascinating; no two are close to being alike. In MR, consensus falls along product lines. Holograms in the living room may seem fun, but are they collaborative, useful or interesting?

To see this emerging field across borders, I stepped back to define the field differently: What realities are we moving toward and why?

Is Mixed Reality primarily a live experience with virtual work and play?

This type of linear “reality” scale encompasses some but not all MR experiences, as each of these terms is redefined again and again.

Mixed Reality is discussed as the future of many fields such as surgical medicine and other mission-critical collaborations with expert engagement, yet no two experts agree on what constitutes a mixed reality experience.

As I asked VR/MR/AR leaders to define the field my question evolved:

MR is found in real-time surgery, training and design yet very few people agree on how to share a collaborative reality. Mixed reality often refers to social experiences, woven together and integrated into a shared view, often featuring live media that spans across platforms, space and physicality. Social, spatial, live collaboration can be difficult to integrate and make accessible — and we are at a pivotal time in network development for interactive media to reach new audiences. Artists are starting to shape new realities together across borders in branded and unbounded play.

When we create at the edges of new realities, we are also recreating ourselves and our culture with new eyes. Tiltbrush dance parties are just the beginning. Sharing new perspectives in cohesive, fun and meaningful ways is a part of the design challenge for mixed reality producers. Identifying tools and modalities that deepen interactions over time for playful engagement and problem-solving is part of the current interaction and production design field-building challenge. MR may be the most malleable, mercurial media imaginable.


Not all AR or VR is social, yet mixed reality is usually a shared or social experience that blends some aspects of real and virtual space. Cameras or live chat are often used for real world feedback within mixed reality experiences. At demos we are seeing mixed reality playspaces with a mix of AR + VR connections where designers or teachers can experience and collaborate on what students are creating inside VR design programs. Conversation and engagement happen inside the shared view and outside in physical space, providing an easier way to host global conversations that build consensus.

The history of Mixed Reality live event streaming into Second Life a decade ago (photos by Josephine Dorado @funksoup)

Training and education have historically been among the best uses of mixed reality experiences. Starting two decades ago with livestreaming into virtual 3D places, the social side of mixed reality and larger VR social play is now blooming into a series of new platforms for open or closed development, from High Fidelity and Project Sansar to the Social VR baked into the Oculus ecosystem by Facebook, which looks like a video game version of an avatar self. Any conference or panel event can be produced in virtual experiences, but very few VR experiences choose to be full mixed reality interactions with feedback loops. This is changing with holographic design capacity.

Physical touch, movement and physical presence add to rich social connection. Embodiment in social experience is a regular topic of conversation at design conferences and academic events, as perspective and the ability to converse determine the richness of the social experience. Engagement and play are closely tied to connections made through the avatar or directly into the scene in first-person POV. Relational dynamics taking place awkwardly at mixed reality events become rich fodder for experimentation.

Connection dynamics remain glitchy in many MR spaces and tech hurdles can create social barriers to deeper human collaboration.

Discussions of empathy, comprehension and collaboration in social realities circle back to interaction design and perspective as key to delivering any media, story or ongoing narrative play. POV and clickability change agency. Instead of focusing on tech and capacity, the question for designers changes:

What action is most crucial to empower? What interaction drives it?

How do we engage this action? What tools foster active collaboration?

While the level of global social discourse can be deep and wide, social cues and exchanges often appear cartoonish in many platforms due in part to a wide variety of input devices and controller orientations. Chat and voice are possible but passing objects and manipulating items together remains awkward in social experiences. Avatar movements may include hands holding controllers while some social and live event spaces lack inputs like notepads or additional displays to enable deeper collaboration and idea exchange.

Truer touch and sharing hand-to-hand in social spaces will take time as controllers, processing power and graphic capacities improve.

Unique groups of interactive journalists are working on live MR integration with real stories and social experiences for all types of headsets and mobile platforms. Mixed reality live and social storytelling will change how we interact with news and current events. I am working with producers to design channels and networks for global interactive content that blend sensors, blockchain asset management and interactivity across all types of media into experiences remixable into new worlds, and I meet with many producers creating stellar content that transcends what we think of as VR.

Mixed Reality Labs such as ICT at USC push the barriers of devices for social storytelling, exploring film narratives such as The Congress, where an actor is scanned for infinite new realities and worlds. Science fiction and social experiences are now blending in a handful of social VR live experiences where concerts or live broadcasts combine with a conversation or dialogue to create new forms of mixed reality. Live & Social VR events are shifting how virtual experiences become fully living and engaged mixed reality adventures that participants can share and shape together.


Ten years ago avatars produced mixed reality events where a live event and a virtual experience were combined for social interactions. The delivery tool at the time, mobile or virtual world, was determined by the intended audience and market for best distribution. The key was to combine experiences to allow for some flow of information or dialogue between multiple global spaces.

Mixed Reality events mixing avatar performance and livestreaming a decade ago in Second Life by Josephine Dorado

After years of dance parties, live events and MR virtual shows, an extended community of mixed media artists and technologists formed who enjoy producing rich dialogue and experiences across worlds. Inquiry, machinima and ongoing conversation became tools for this style of virtual engagement with MR narratives. Many livestreaming and virtual world artists from a decade ago are still mixing realities, creating live media content as an extension of telepresence or co-location, being in more than one place at the same time.

Ron T. Blechner: A virtual and physical space occupying the same space, either by augmented reality technology or by bi-directional cameras / telepresence between two spaces.

Health & Cancer Research Mixed Reality Conversation with Nonprofit Commons — @TechSoup event with Susan Tenby — I’m focused in the corner driving avatars on screen while livestreaming the event to virtual guests for integrated dialogue.

Today’s live VR and mixed reality takes the form of concerts, events with the previous POTUS, gameplay arcades and live immersion domes with a mix of media. As wearables become more common in play the interactive and live playspace connects sensors with online gameplay sites like Twitch and Steam and VR or AR opportunities for live play and direct creation with others.

VRTIFY offers live music video composition where you can create the room and the experience, and remix your sounds

Music and art composition in collaboration with live content integration is the next generation of MR play in museums and on stage, led by teams like VRTIFY and Google with Tiltbrush MR dance parties.

Live music experiences will grow exponentially thanks to amazing collaborators like my colleagues at Digital Raign, who offer retreats for producers and artists and are now launching an airstream production trailer for live VR interviews at festivals. In the past few months I have met hundreds of recording artists excited to play with live music and VR/MR engagement with their audiences as a new way to make media together.

This excitement for interactivity will infuse the next generation of media networks with tons of replayable content. As graphic quality improves, the uncanny valley of lifelike MR human artist dolls evolves to be more interesting and weird for designers.


Now that artists can sit on our table and create with us, what’s next?

Bouncing creatures in our living room cheering along with us?

Not all current mixed reality translates to a social experience. Young Conker in the HoloLens is live and aware but not necessarily shared with anyone beyond the HoloLens user. This AI-driven character learns your space.

Spatial reasoning and manipulation is the next step in mixed reality as room-sized sensors become more common across platforms. The integration of spatial reasoning, cameras, sensors and live interactions can change the environment and the nature of collaboration and play in the environment.

Leslie Oliver Karpas: VR/AR + Machine Vision = Mixed Reality … when a virtual environment is positionally aware of the content and users therein and can manage the interactions between the virtual and the actual.

To share spatial information in collaboration, a shared perspective and hand-off are essential in the workspace. Sensors are key, and linking these objects is complex. This is where the tech gets tricky:
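One reason the linking is hard: each headset or sensor reports positions in its own coordinate frame, so devices have to agree on a shared physical anchor and map between frames. A minimal Python sketch of that mapping, simplified to a 2D plan view; the function names, poses and numbers are illustrative, not any platform's actual API:

```python
import math

def make_frame_transform(anchor_in_a, anchor_in_b):
    """Build a mapper from device A's coordinate frame to device B's,
    given the same physical anchor observed in both frames.
    Poses are (x, y, yaw_radians) in a simplified 2D plan view."""
    ax, ay, ayaw = anchor_in_a
    bx, by, byaw = anchor_in_b
    dyaw = byaw - ayaw  # rotation that carries A's frame into B's

    def a_to_b(px, py):
        rx, ry = px - ax, py - ay                   # point relative to anchor in A
        c, s = math.cos(dyaw), math.sin(dyaw)
        qx, qy = c * rx - s * ry, s * rx + c * ry   # rotate into B's orientation
        return bx + qx, by + qy                     # re-attach to anchor in B

    return a_to_b

# Device A sees the anchor at its origin facing +x; device B sees the same
# anchor at (2, 1), rotated 90 degrees.
to_b = make_frame_transform((0.0, 0.0, 0.0), (2.0, 1.0, math.pi / 2))
print(to_b(1.0, 0.0))  # a point 1 m in front of the anchor, seen from B
```

Real platforms do this in 3D with quaternions and drifting tracking data, which is why shared anchors and recalibration remain recurring pain points.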

Domes and AR gaming are already leading the way in spatial play. Collaborative spatial engagement will continue to evolve rapidly in the next few years with the introduction of smart glasses, arcades and light field technologies from companies like Magic Leap, Apple and Meta.

Mixed Reality Arcades such as the new IMAX VR experience zone in LA feature real world props and a blending of physical and virtual objects in space. In arcades and other group play your devices may buzz or vibrate to add to the sensory experience while full body suits using technology like the Subpac transmit sound through your body. This type of physical MR play will become more popular in amusement parks, malls and now hotels thanks to ExitRealityVR and other experience producers. We are currently prototyping a number of experiences in our special effects shop in downtown Los Angeles for parks, theatres and new media releases worldwide.


Mixing realities may include transcending traditional screens, headsets and known displays or devices. New forms of mixed reality can be manipulated as holograms in space and time as designers learn to collaborate and change physical and virtual objects with hands instead of controllers and pens. By 2019 these immersive playspaces will be common at home and in public.

Design collaboration for play has expedited the move to MR. Sunny Dhillon published this working definition of Mixed Reality after VRDC:

In Mixed reality you see the virtual world with real world items, which are laden with LED trackers that cameras can track and bring into the VR world. Mixed Reality arcades blend the physical and virtual worlds together, in controlled ways.

This type of multimodal play that combines familiar objects like bricks and brushes with headsets, wearables, magic wands and other peripherals will change the way we work and play together. Collaboration is completely new in AR and MR where the physical world becomes a canvas for new possibilities. Hands become tools and bodies become a part of the process. Design collaborations are far richer when overlays are shared — and this technology is coming soon to home gaming systems.

Second HoloLens = XBOX? Jumping to V3 in 2019 with Mixed Reality XBOX coming in 2018 according to GDC interviews

Quin Mark Cabalquinto: Traditional Mixed Media uses distinct mediums such as charcoal, ink, paint, photography, and poetry to create a single visual work.
Mixed Reality incorporates a variety of distinct mediums and media such as projection, lighting, sound, AR and VR to create a cohesive experience.
Whereas Mixed Media is a fixed passive work, Mixed Reality can be interactive and reveal distinctly different experiences for each participant.

Live 360 content with holographic multimodal media allows for a new type of mixed reality mediamaking where cameras can be installed and shared within a wide variety of platforms from Facebook and YouTube to domes and VR headsets. Mixed Reality can transport you anywhere immediately, such as Intel’s greenscreen holodeck lounge at GDC 2017 streaming live to YouTube.

Blueprint Reality demos live MR construction design in a greenscreen cave at the UploadVR VIP Mixer at GDC 2017

Sensors or cameras connect the entire experience, integrating information from full body suits or other wearables. SmartSuits demoed at VRDC this year by RoKoKo feature dozens of sensors sharing real time data for live dancing in VR, games, screens or other devices in multimodal performances and experiences. With tactile play and suits we are one step closer to live dance parties in mixed reality, available anywhere and easily playable together. Add Tiltbrush or MicrodoseVR and you have a full sensory creation experience.
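Streaming dozens of sensor readings into a live scene also means damping jitter before it reaches the avatar. A minimal sketch of one common approach, exponential smoothing, in Python; the readings and joint are hypothetical, not Rokoko's actual data format:

```python
def smooth_stream(samples, alpha=0.3):
    """Exponentially smooth a streamed joint angle (radians) so sensor
    jitter does not shake a live avatar. Lower alpha = steadier but laggier."""
    smoothed, state = [], None
    for angle in samples:
        state = angle if state is None else state + alpha * (angle - state)
        smoothed.append(state)
    return smoothed

# Noisy elbow readings arriving from a (hypothetical) suit stream; the
# one-frame spike at 0.80 gets damped instead of jolting the avatar.
readings = [0.50, 0.52, 0.49, 0.80, 0.51, 0.50]
print([round(a, 3) for a in smooth_stream(readings)])
```

The trade-off is a frame or two of added latency, which live performers notice, so production rigs tune `alpha` per joint.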

A dancer wearing the RoKoKo SmartSuit dances for live MR while a DJ spins at the UploadVR VIP event at GDC

Thousands of developers are getting their hands on SDKs and open platforms as new types of mixed reality demos start popping up at every event in town. Technologists tend to focus on the affordances of products while storytellers often start with purpose and designers work through the goals of interactions to effectively drive interactive participation across channels and markets. Teams that include a mix of business, art and media, tech and impact are often most effective at producing quality mixed reality experiences.

Now that cameras are lower cost and easier to obtain, current content gaps in AR & VR will rapidly be filled with live mixed reality media events. Exponential growth of independent, low-cost production drives easier and faster MR development and delivery across platforms, encouraging designers to rethink interactivity and engagement.

High Fidelity is demonstrating rapid live event generation for groups of 100+ as a future solution for multimodal & social MR concurrency available to the Vive, Rift or most VR/AR headsets while large scale streaming global events are happening in domes & VR platforms. While sight and sound often work together to link environments, individual results vary widely. Social experiences across VR platforms remain awkward, as some interactions and elements of the experience do not translate well and standards are still being developed for cross-platform experiences.

There are a number of distribution and delivery silos to overcome as MR market distribution follows hardware app store models. Few platforms offer deep interactivity with content makers and stories, a trend changing as more people develop integrated content for arcades, HTC Vive or AR experiences instead of uploading passive videos to YouTube or Facebook 360 Live.


Mixed Reality displays are now integrated in sleeker wearables such as smart glasses by ODG for military and police use as well as professional training applications. Data becomes an overlay for collaborative work in these shared visual environments that can also be linked for communication, lighting, power and other rugged field uses (see the helmets our team engineered for Tron for example — integrating light, sound, visual overlays and temperature control). For police and military, live MR integration changes information exchanges and interactions in the real world. Mixed reality markets may thrive outside of consumer VR or AR product lines as R&D pioneers such as ODG push the boundaries of senses with data integration.

MR may include mobile, VR, AR and livestreaming to laptops and wearables like helmets or glasses. There’s an art to this level of complexity in technology integration that brings together experience zones for more seamless conversations. In most cases the mixed reality producer remains bound by signal strength, firewalls and data regulations. Signal pairing without wires or external interference remains challenging for most MR experiments.

Glitches abound and most anything can go wrong in MR. Time delays, sound quality, code and computing capacity play a role in production quality, along with the integrity of capture, video processing and the platform for distribution. Interaction design for VR & MR is a developing art as we learn to work differently in 360 and live media production design. To bridge gaps in knowledge and coverage, machinima has been a common form of mixed reality media sharing for 15+ years, such as this global architectural partnership to redesign the public commons in Cairo and the US.

Mixed reality global diplomacy for interactive dialogue between platforms, worlds and communities has been enabled in virtual environments for over a decade. Now with the rapid rise of cheap headsets and mobile VR this field is poised to grow exponentially over the next decade as the power to beam right into any global meeting is in your hand and smartphone. This immediacy of integration can change the balance of power as distant villages or hospital beds are no longer barriers to deep interactions with a wide mix of global networks and opportunity spaces.

Opening the door to REAL engagement is key for deeper interactivity.

Metaverse Roadmap definitions from 2005 are now mixing wildly in MR experimentation with sensors, sims, alternate identities and real world augmentation

Integrated virtual combinations of the “metaverse” of mixed reality technologies have been floating in academic circles for over two decades. At Stanford in 2005 I sat in on early discussions for the Metaverse Roadmap, developed as a field guide to show emerging trends in VR, AR and other types of live, shared media. These fields are now more challenging to separate. From the 1994 paper on Mixed Reality Displays by Milgram & Kishino:

Mixed Reality (MR) visual displays, a particular subset of Virtual Reality (VR) related technologies that involve the merging of real and virtual worlds somewhere along the “virtuality continuum” which connects completely real environments to completely virtual ones.

The physical LIVE to VIRTUAL connection here is highlighted as the key to merging worlds for a quality experience. Six classes of hybrid MR display environments were identified in 1994 — we have expanded now to include dozens of potential integrations. To explore this virtuality continuum and connect more with real environments and experts in this field look at training as the killer app market for mixed reality collaboration.

How do we combine MR/VR/AR for ideal educational discovery with guides? There are many different types of use cases with AR smart glasses being a common first step to MR for training. From the 1994 paper above:

Probably the best known of these is Augmented Reality (AR), which refers to all cases in which the display of an otherwise real environment is augmented by means of virtual (computer graphic) objects. The converse case on the virtuality continuum is therefore Augmented Virtuality (AV).

Augmented Virtuality is close to some modern definitions of MR, as the virtual environment can now be augmented by others, as we see in AR headsets like Meta 2 or HoloLens or social VR platforms like High Fidelity or Sansar. These platforms allow players to create new types of environments that can be generative, emergent and constantly in flux with play. As this form of textural play becomes anchored to the real world it becomes a form of augmented virtuality or MR collaboration that can be exported and translated to 3D printers or architectural designs.

Integration with the real world of possibilities may be the greatest frontier for mixed reality producers.


Around this world and beyond, mobile and virtual experiences are becoming our primary access point to a world of possibilities. We see potential futures unfold. An invitation to participation is especially meaningful for marginalized and de-funded communities such as GLBTQ artists, disabled veterans and displaced indigenous tribes, three of the first communities to find great value in virtual experiences. Reshaping identity and community is easier done in virtual spaces than in reality.

Accessibility for ability levels and sensory input needs can now be hacked and woven together in wild new ways for most anyone to participate and expand their senses. Add AI and sensors and see what happens! To expand capacity, open the door to engagement and ask for deeper interactions over time. These systems can learn and grow with us.

To support inclusion and accessibility we have seen the “Cognitive Immersion” movement grow, where AI partners with sensors and the IoT around us to create awarenesses that help us filter potential decision points in our real and imagined environments. Kyle Li posted this great definition at the Immersive Storytelling Symposium at the New School in NYC in February 2017:

Kyle Li and panelists at Parsons/New School in NYC at Immersive Storytelling: Exploring AI, Immersion & Perspective

Interactivity with physical objects is often key to mixed reality design. Sometimes these objects are smart or connected to other smarter systems. New workflows and collaborations are being pioneered for remote work with AI & mixed reality allowing for global collaboration with different perspectives, sharing in real time what is happening on their end of the process. This changes our access to physicality and our ability to collaborate, especially when physical boundaries restrict the engagement and play.

Mixed reality can transcend borders and accessibility needs.

It is possible to have a rich virtual experience while not being able to move physically in the real world. Not all of mixed reality is designed for the visual interactive landscape as we see other senses like touch and smell beginning to expand interactivity in the virtual experience.

The best VR experience is often said to be Notes on Blindness in the Within app, a way for the seeing to understand the life of the blind. VR and AR together provide unique MR affordances for people who cannot leave their homes or may need different types of interactions in order to succeed. Adding other senses like touch or smell adds layers to a reality-crafting palette.

While most VR plays with vision and sound as the primary input, immersive producers play with touch, smell and other sensory inputs like temperature, vibration or breath to enrich mixed reality experiences

Jacki Morie: It’s not how others define it or even how you define it. It’s what you do that then becomes something people can put labels on.

Dr. Jacki Morie is an artist/producer creating new peripherals such as a Scent Collar to show the powerful emotional landscape of olfactory experiences as part of human-computer interaction (note: I co-own Toyshoppe Systems, a film and media FX company in LA prototyping peripherals and wearables for MR/VR/AR tours, film and transmedia projects). While blending physical senses with fabricated sensors will take a few decades to perfect, we are beginning to see wearables such as headsets, suits, gloves and jewelry used as controllers and as primary interaction/transaction devices at public events, with the SmartSuit replacing motion capture stages.

Mixed Reality reveals the next stage of access to expand interactive senses. This will grow with adoption of wearables and headsets. We are seeing the first steps toward accessibility across these reality landscapes.

VR movements for access and opportunity have been thriving for almost 20 years as people who were previously home-bound find community and economic opportunity in vast virtual landscapes. Mixing live, physical, social experiences for greater depth of interactivity will supercharge accessibility movements, especially as WebVR and 3D web access become more prevalent.

Recent tests in live Mixed Reality music performances are coming closer to embodied collaboration with robust access

Soon we will access mixed reality content in our living rooms, on our mobile phones and anywhere we connect. For now, most of this content is designed for demos with greenscreens, tech labs and able-bodied alpha adopters willing to deal with cords and awkward technology crashes. The Economist, in recent coverage of headset adoption, noted mass market releases for as low as $30 coming soon as these glitches smooth out:

Zappar recently launched a Kickstarter campaign to offer an affordable mixed-reality kit called ZapBox which will bring mixed-reality and VR experiences to the mass market.

As the cost of VR and other peripherals also drops, Mixed Reality will be more available at home in your mobile phone and favorite headset in the coming year as live and social reality layers collide with daily work and play. This immediacy in the home and learning environment changes the nature of on-demand education, health, technical collaboration and design, as the experts we once traveled to see can now appear to us anytime in our living rooms. Products like ZapBox promise to deliver something akin to the professional experience of HoloLens or Magic Leap at a price point comparable to Google’s Cardboard for VR, but it may take a few years to see which teams win on adoption and sales. So far, higher cost VR sets are selling sluggishly compared to mobile VR experiences, yet this field evolves fast.

At YouTube’s Mixed Reality Lab, using an HTC Vive controller to match the rotation of the physical camera in Unity for greenscreen mixed reality in games. This shows how Mixed Reality is built live.
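The technique in the photo typically works by rendering virtual content in two passes: objects nearer the camera than the player are composited over the keyed greenscreen video, and the rest are rendered behind it. A minimal Python sketch of that depth split; object names and depths are illustrative, not Unity's actual API:

```python
def split_passes(objects, player_depth):
    """Split virtual objects into foreground and background render passes
    for greenscreen MR capture, by camera-space depth (meters).
    Objects nearer than the player are composited over the keyed video."""
    foreground = [name for name, depth in objects if depth < player_depth]
    background = [name for name, depth in objects if depth >= player_depth]
    return foreground, background

scene = [("sword", 1.2), ("dragon", 5.0), ("hud_orb", 0.8)]
fg, bg = split_passes(scene, player_depth=2.0)
print(fg, bg)  # → ['sword', 'hud_orb'] ['dragon']
```

In practice the player's depth comes from the tracked headset and the physical camera's pose from a spare Vive controller mounted to it, which is why the controller-to-camera calibration matters.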

In the coming year we will see consumer releases launching from Apple, Google and Microsoft that blend MR with virtual and real world interactions. This field will be rapidly expanding with new experimentation with lower-cost interfaces and sensors available for tools like the HTC Vive.

If you have questions on mixed reality workflows, cameras for live mixed reality production, live VR/MR for broadcast, or prototyping new types of controllers, displays and interfaces, send a message here to meet R&D and production teams around the world. Here’s one more quick look at various approaches to mixing realities.

I’d like to see your definitions in the comments below. Tag me @EvoHeyning if you choose to share in public.

CEO @PlayableAgency ✩ Founder @LightLodges ✩ Producer ✩ Advisor ✩ Artist ✩ Speaker ✩ Media Design ✩ Interactive Mixed Reality ✩ Strategist ✩ Consultant