Microsoft Mesh for Building Mixed Reality Apps Highlighted at Ignite

The Microsoft Ignite Day 1 keynote presentations were heavy with talk about Microsoft Mesh, a new Microsoft Azure-based platform for building "cross-platform mixed reality apps" for multiple participants, according to a Tuesday announcement.

"Microsoft Mesh connects the physical and the digital worlds, allowing us to transcend the traditional boundaries of space and time," said Microsoft Technical Fellow Alex Kipman during the keynote address, which featured an underwater background complete with circulating fish, plus Cirque du Soleil stage-design concepts.

Arriving in Coming Months
Availability of the Mesh platform wasn't precisely described. Some of its capabilities, which feature artificial intelligence (AI)-enhanced aspects, are still months away, according to another Microsoft announcement.

"The Microsoft Mesh platform will in coming months offer developers a full suite of AI-powered tools for avatars, session management, spatial rendering, synchronization across multiple users and holoportation to build collaborative solutions in mixed reality," the announcement stated.

"Holoportation" refers to scanning objects in three dimensions, such as people, and importing them into a shared virtual space.

Microsoft Mesh's AI capabilities are designed to ease some of the technical challenges facing developers. Microsoft listed those capabilities as "immersive presence, spatial maps, holographic rendering and multiuser sync."

Microsoft Mesh uses avatars in distinct forms, plus holoportations, to address immersive presence issues across devices. Spatial maps are used to anchor objects in an environment, and Microsoft claims its approach is "orders of magnitude more accurate than GPS." Holographic rendering is designed to be either local or cloud-connected to achieve the best image fidelity. Lastly, Microsoft deals with latency issues in collaborative sessions with its multiuser sync solutions.

Microsoft Mesh developers will get "out-of-the-box avatars" that they can use. It's also possible to capture a three-dimensional image of a person using specialized hardware, such as an Azure Kinect camera or a Mixed Reality Capture Studio, which requires using so-called "outside-in sensors."

Developers can build these so-called "Mesh-enabled apps" in C++ or C#, or via Unity, using a software development kit. Microsoft is planning to add support for "Unreal, Babylon, and React Native" in the "coming months."

Mixed Reality Issues
Microsoft shipped HoloLens 2, its mixed reality headset, more than a year ago, along with Dynamics 365 apps for industrial applications. However, the development of applications for mixed reality headsets has lagged.

Issues include representing people with sufficient realism and keeping a hologram stable in a shared virtual space. Synchronizing the expressions of people spread across geographic locations has been another challenge. Support for high-fidelity three-dimensional models using current file formats has been another limitation.

Microsoft Mesh is an attempt to overcome those limitations for developers building HoloLens Mesh apps for commercial release. The platform also will support partner-built apps.

Microsoft Mesh uses Azure Active Directory and Microsoft accounts for user authentication. In a future release, Microsoft is planning to add Microsoft Teams and Microsoft Dynamics 365 support. Those applications will have underlying Microsoft Graph support for surfacing metadata and content.

Microsoft is claiming that Microsoft Mesh will be supported on devices of all types, not just on augmented reality or mixed reality headsets. However, users of mobile devices, as well as Macs and PCs, will only be able to see two-dimensional images.

About the Author

Kurt Mackie is senior news producer for 1105 Media's Converge360 group.

