
openxr-unity-mixedreality-samples's Introduction

page_type: sample
name: OpenXR Mixed Reality samples for Unity
description: These sample projects showcase how to build Unity applications for HoloLens 2 or Mixed Reality headsets using the Mixed Reality OpenXR plugin.
languages: csharp
products: windows-mixed-reality, hololens

OpenXR + Unity + Mixed Reality Samples

Welcome!

[screenshot: OpenXR-Unity-MixedReality-Samples main menu]

These sample projects showcase how to build Unity applications for HoloLens 2 or Mixed Reality headsets using the Mixed Reality OpenXR plugin. For details on installing the related tools and setting up a Unity project, see the plugin documentation at https://docs.microsoft.com/. For details on using the Mixed Reality OpenXR plugin API, see the API documentation at https://docs.microsoft.com/.

⚠️ NOTE : This repository uses Git Large File Storage to store large files, such as Unity packages and images. Please install the latest git-lfs before cloning this repo.

Recommended tool versions

It's recommended to run these samples on HoloLens 2 using the following versions:

  • Latest Visual Studio 2022 or 2019
  • Latest Unity 2020.3 LTS. Please double-check Unity's known blocking bugs for HoloLens 2.
  • Latest Unity OpenXR plugin, recommended 1.3.1 or newer.
  • Latest Mixed Reality OpenXR Plugin, recommended 1.4.0 or newer. Please follow the latest release notes.
  • Latest MRTK-Unity, recommended 2.7.3 or newer.
  • Latest Windows Mixed Reality Runtime, recommended 109 or newer.

Sample for anchors and anchor persistence

See details in Anchor Sample Scene.
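
In outline, the persistence flow looks like the hedged sketch below. The entry-point name LoadAnchorStoreAsync appears in the plugin's own log messages quoted later on this page, but member shapes vary across plugin versions, so treat this as a sketch and verify against the scene's AnchorsSample.cs.

using Microsoft.MixedReality.OpenXR;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hedged sketch of anchor persistence; see AnchorsSample.cs for the real code.
public class AnchorPersistenceSketch : MonoBehaviour
{
    [SerializeField] private ARAnchorManager anchorManager;
    private XRAnchorStore anchorStore;

    private async void Start()
    {
        // Load the on-device anchor store; null when the feature is unavailable.
        anchorStore = await XRAnchorStore.LoadAnchorStoreAsync(anchorManager.subsystem);
        if (anchorStore == null) { return; }

        // Re-create every previously persisted anchor in this session.
        foreach (string name in anchorStore.PersistedAnchorNames)
        {
            anchorStore.LoadAnchor(name);
        }
    }

    // Persist an existing ARAnchor under a name so it survives app restarts.
    public bool Persist(ARAnchor anchor, string name) =>
        anchorStore != null && anchorStore.TryPersistAnchor(anchor.trackableId, name);
}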

Sample for hand tracking
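
A hedged sketch of the hand-joint API this sample exercises: the plugin's HandTracker locates the hand joints each frame. Type and member names follow the plugin documentation; the scene's own scripts may differ.

using Microsoft.MixedReality.OpenXR;
using UnityEngine;

// Hedged sketch: locate all joints of the left hand every frame.
public class HandJointsSketch : MonoBehaviour
{
    private readonly HandTracker handTracker = new HandTracker(Handedness.Left);
    private readonly HandJointLocation[] joints = new HandJointLocation[HandTracker.JointCount];

    void Update()
    {
        // TryLocateHandJoints returns false while the hand is not tracked.
        if (handTracker.TryLocateHandJoints(FrameTime.OnUpdate, joints))
        {
            Pose palm = joints[(int)HandJoint.Palm].Pose;
            Debug.Log($"Left palm at {palm.position}");
        }
    }
}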

Sample for eye tracking

FollowEyeGaze.cs in the Interaction scene demonstrates using Unity Feature Usages to obtain eye tracking data.
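
The gist of that script, as a hedged sketch using standard UnityEngine.XR feature usages (not the file's exact contents):

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Hedged sketch: follow the eye-gaze ray via Unity Feature Usages.
public class EyeGazeSketch : MonoBehaviour
{
    void Update()
    {
        var devices = new List<InputDevice>();
        InputDevices.GetDevicesWithCharacteristics(InputDeviceCharacteristics.EyeTracking, devices);
        foreach (var device in devices)
        {
            if (device.TryGetFeatureValue(CommonUsages.isTracked, out bool tracked) && tracked &&
                device.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 position) &&
                device.TryGetFeatureValue(CommonUsages.deviceRotation, out Quaternion rotation))
            {
                // Place this object two meters out along the gaze ray.
                transform.SetPositionAndRotation(position + rotation * (Vector3.forward * 2f), rotation);
            }
        }
    }
}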

Sample for locatable camera

LocatableCamera.cs in the LocatableCamera scene demonstrates the setup and usage of the locatable camera.
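
At its core the scene drives Unity's PhotoCapture API; below is a hedged sketch of the pattern (standard UnityEngine.Windows.WebCam calls; the resolution values are placeholders):

using UnityEngine;
using UnityEngine.Windows.WebCam;

// Hedged sketch: capture one photo to memory and read the camera pose.
public class LocatableCameraSketch : MonoBehaviour
{
    void Start()
    {
        PhotoCapture.CreateAsync(false, capture =>
        {
            var parameters = new CameraParameters
            {
                cameraResolutionWidth = 1280,  // placeholder resolution
                cameraResolutionHeight = 720,
                pixelFormat = CapturePixelFormat.BGRA32,
            };
            capture.StartPhotoModeAsync(parameters, result =>
                capture.TakePhotoAsync((photoResult, frame) =>
                {
                    // The "locatable" part: where the camera was at capture time.
                    if (frame.TryGetCameraToWorldMatrix(out Matrix4x4 cameraToWorld))
                        Debug.Log($"Camera position: {cameraToWorld.GetColumn(3)}");
                    capture.StopPhotoModeAsync(_ => capture.Dispose());
                }));
        });
    }
}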

Sample for ARFoundation compatibility

The ARAnchor, ARRaycast, ARPlane, and ARMesh scenes are all implemented using ARFoundation, backed in this project by the OpenXR plugin on HoloLens 2.
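
For instance, the ARRaycast scene's pattern reduces to standard ARFoundation calls like this hedged sketch:

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hedged sketch: raycast from the head against detected planes.
public class RaycastSketch : MonoBehaviour
{
    [SerializeField] private ARRaycastManager raycastManager;
    private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        var ray = new Ray(Camera.main.transform.position, Camera.main.transform.forward);
        if (raycastManager.Raycast(ray, hits, TrackableType.PlaneWithinPolygon))
        {
            // Hits are sorted by distance; the closest pose could host a cursor.
            Pose hitPose = hits[0].pose;
            Debug.DrawLine(ray.origin, hitPose.position, Color.green);
        }
    }
}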

Sample for Azure Spatial Anchors

SpatialAnchorsSample.cs in the Azure Spatial Anchors sample project demonstrates saving and locating spatial anchors. For more information on how to set up the Azure Spatial Anchors project, see the readme in the project's folder.
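
The locate side of that flow boils down to the ASA SDK pattern below (hedged sketch; the anchor IDs and component wiring are placeholders, so check SpatialAnchorsSample.cs for the real code):

using Microsoft.Azure.SpatialAnchors;
using Microsoft.Azure.SpatialAnchors.Unity;
using UnityEngine;

// Hedged sketch: locate previously saved Azure Spatial Anchors by identifier.
public class AsaLocateSketch : MonoBehaviour
{
    [SerializeField] private SpatialAnchorManager anchorManager;

    public async void Locate(string[] anchorIds) // anchorIds: placeholder input
    {
        await anchorManager.StartSessionAsync();
        anchorManager.AnchorLocated += (sender, args) =>
        {
            if (args.Status == LocateAnchorStatus.Located)
                Debug.Log($"Located cloud anchor {args.Identifier}");
        };
        // A watcher searches for the given IDs until stopped or all are found.
        anchorManager.Session.CreateWatcher(new AnchorLocateCriteria { Identifiers = anchorIds });
    }
}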

Sample for Holographic Application Remoting

AppRemotingSample.cs in the Main Menu scene demonstrates app remoting. For more information on how to set up the Basic Sample project for App Remoting, see the readme in the project's folder.
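
In outline, connecting from a remote app looks like the sketch below. The entry points (AppRemoting.StartConnectingToPlayer, AppRemoting.Disconnecting) are the ones referenced in the issues later on this page; the configuration field names and event shape are assumptions that vary across plugin versions, so verify against AppRemotingSample.cs.

using Microsoft.MixedReality.OpenXR.Remoting;
using UnityEngine;

// Hedged sketch: start a Holographic Remoting connection to a player device.
public class RemotingConnectSketch : MonoBehaviour
{
    public void Connect(string playerIpAddress)
    {
        // Assumed event shape: the plugin reports a DisconnectReason here.
        AppRemoting.Disconnecting += reason =>
            Debug.Log($"Remoting disconnecting: {reason}");

        // Assumed configuration fields; check your plugin version's docs.
        AppRemoting.StartConnectingToPlayer(new RemotingConnectConfiguration
        {
            RemoteHostName = playerIpAddress,
            RemotePort = 8265, // default Holographic Remoting port
        });
    }
}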

How to file issues and get help

This project uses GitHub Issues to track bugs and feature requests. For help and questions about using this project, please also use GitHub Issues. Please search the existing issues before filing new ones to avoid duplicates. For new issues, file your bug or feature request as a new issue.

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.

When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.

Trademarks

This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.

openxr-unity-mixedreality-samples's People

Contributors

devdeeprc, elbuhofantasma, hferrone, kbonds, keveleigh, marlenaklein-msft, microsoftopensource, satyapoojasama, yl-msft


openxr-unity-mixedreality-samples's Issues

Missing Controller.Velocity on HoloLens 2

If I inherit from IMixedRealityTouchHandler, implement the needed methods, and try to get eventData.Controller.Velocity, I get Vector3.zero. So _velocityVector here only works in the Unity editor; on HoloLens 2 it is always Vector3.zero. Installed: OpenXR Plugin 1.4.1, Unity version 2020.3.33f1.
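
For what it's worth, velocity can also be probed directly at the Unity XR layer, bypassing MRTK (a diagnostic sketch using standard UnityEngine.XR feature usages; the characteristics filter may need adjusting per device):

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Diagnostic sketch: log device velocity straight from the XR input layer.
public class VelocityProbe : MonoBehaviour
{
    void Update()
    {
        var devices = new List<InputDevice>();
        // HeldInHand should match both motion controllers and articulated hands.
        InputDevices.GetDevicesWithCharacteristics(InputDeviceCharacteristics.HeldInHand, devices);
        foreach (var device in devices)
        {
            if (device.TryGetFeatureValue(CommonUsages.deviceVelocity, out Vector3 velocity))
                Debug.Log($"{device.name} velocity: {velocity}");
        }
    }
}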

Unity LogError: "ConnectToPlayer is not supported, enable App Remoting feature to use this."

Hello!
I'm new to this field, and when I was testing this Remoting function I always ran into this "ConnectToPlayer is not supported, enable App Remoting feature to use this." error and I don't know how to solve it. I followed the instructions here and found the sample successfully connected to my HL2, but when I tried to build this functionality into my own project (by dragging the prefabs into the scene), it didn't work. I definitely checked these two options:

Here is the information regarding the RemotingSampleScene, which can connect to the HL2 successfully on my PC:
[screenshot: remotingSample]
And here is the information regarding my own project:
[screenshot: myproject]
I noticed that in my project there is no “OpenXR Runtime” entry, but I don't know why, or whether it has anything to do with my error.
I also checked the code, but since I know so little about programming I can't tell where the problem is.
[screenshot: code]
Did I miss something? I think it may just be a minor "project settings" problem, but I'm not sure where to look.
If you need more details, please tell me and I will try to provide more information.
Thanks in advance!

Can't build for HoloLens 2 because of a FileNotFoundException

Hi,

I'm trying to test the OpenXR HoloLens samples but I'm facing a FileNotFoundException:
"FileNotFoundException: Packages/com.microsoft.mixedreality.openxr/Runtime/universalwindows/x64/MicrosoftOpenXRPlugin.dll does not exist"

This is the first time I've opened this project and I haven't made any changes to it.

Unity version : 2020.3.13f1

Azure Spatial Anchors not consistently located in sample

The bug

In the ASA Sample, anchors that were previously created locally (or indeed any valid ASA anchor) are only found about 30% of the time.

The first time the device attempts to find an anchor it has just saved, it almost always succeeds.

On subsequent attempts (i.e. the watcher has been deleted, the session stopped, and a new session started) it almost always fails to find the anchor. This seems to be unrelated to lighting conditions, consistency of the environment, position + orientation of the headset, or anything else. When location fails, no errors are ever logged by the SDK internals, and AnchorLocated is never called with any arguments.

To reproduce

Steps to reproduce the behavior:

  1. Build and run the ASA Sample
  2. Tap 'Start Session'
  3. Place 2 anchors by air tapping
  4. Save both anchors by air tapping in their boxes
  5. Confirm both have been saved
  6. Tap 'Stop Session'
  7. Tap 'Start Session'
  8. Observe that both anchors were located and appear as green boxes
  9. Tap 'Stop Session'
  10. Tap 'Start Session'
  11. Observe that (usually) at least one, or both of the anchors are never located
  12. Repeat steps 9-11 several times (varying the wait between stopping and restarting the session if you want to), and observe that the ability of the SDK to locate the anchors is seemingly random (and usually fails).

Expected behavior

Under reasonable environmental conditions an ASA anchor in a given location should always be found, or at least some error logging should happen against the SpatialAnchorManager's Error callback.

Setup

Unity Version: 2020.3.19f1
MRTK Version: 2.7.2
Unity OpenXR Plugin Version: 1.2.8
Mixed Reality OpenXR Plugin Version: 1.0.3.0
Windows Mixed Reality Runtime: 107.2109.10010
Device OS Version: 10.0.20348.1018
Azure Spatial Anchors SDK Version: 2.10.0-beta0014

Locatable Camera

How can I pinpoint the position of a hologram present in the camera frame while capturing an image from the locatable camera?

How to use the samples?

Would somebody please explain how to import, copy, or drag-and-drop whatever is needed to use this stuff? I got an empty project created by following the instructions here, which link to this repo.
My brand new, supposedly MRTK-ready project has a SampleScene just like it looked when Unity created the project. Obviously some steps are missing between setup and actually using MRTK or any sample.

AzureSpatialAnchorsSample - Anchors recognized only in the same app session

Unity 2021.3.14f1 + MRTK 2.8.3 on Hololens 2

I'm testing this example; Azure sessions work, and anchors are created and recognized across different Azure sessions.
But if I kill the app and run it again, no anchors are found.

I need to be able to reload all anchors and share them with other devices.

I just configured azure credentials in the spatial anchor manager component.

Am I forgetting some setting?

thank you!

UWP build error: MSB3774 Could not find SDK "WindowsMobile, Version=10.0.22621.0"

Trying to build the samples with Unity 2021.3.23f1,
configured as UWP, Intel x64, D3D project,
Target SDK 10.0.22621.0, Min. Platform 10.0.19041.0,
build and run on local machine,
but I got this error in Visual Studio 2022 (17.5.4):

MSB3774 Could not find SDK "WindowsMobile, Version=10.0.22621.0"

The individual components installed in my Visual Studio 2022 do include:
Win10 SDK 10.0.19041.0, 10.0.20348.0
Win11 SDK 10.0.22000.0, 10.0.22621.0

Spatial graph nodes

The latest plugin release (1.4.0) exposes the ability to work with dynamic spatial graph nodes, specifically calling out interop with PV camera tracking. Hopefully this is an OK place to ask but I have a number of questions about this and about the 'spatial graph' concept in general. So far I've encountered HL2 spatial graph nodes coming from a few different places:

  • Mixed Reality QR library (static)
  • Research Mode API (dynamic)
  • Node IDs listed against spatial anchors in Map Manager in HL2 device portal (static)
  • SpatialGraphInteropFrameOfReferencePreview objects created from SpatialCoordinateSystem objects (static)

My questions are:

  • Are there other APIs which are sources of spatial graph nodes? How can they be created? I have seen no API for creating them, only for interacting with existing GUIDs.
  • What is the difference between dynamic and static nodes? Static nodes appear to move/adjust over time as dynamic ones do.
  • What is the difference between spatial graph nodes and SpatialCoordinateSystem? Are these two things equivalent? Why do some APIs seem to expose only one or the other?
  • How do spatial anchors relate to spatial graph nodes? One node often seems to map to many anchors (see Map Manager). Is a persisted spatial anchor anything more than a record of a spatial node GUID plus an offset?
  • How does one obtain a dynamic node representing the PV camera (as hinted at in 1.4.0 release notes)? I've previously used SpatialGraphInteropFrameOfReferencePreview + MediaFrameReference.CoordinateSystem + XR_MSFT_spatial_graph_bridge to obtain PV camera poses when using OpenXR. But a dynamic node sounds cleaner/better.
  • SpatialGraphInteropFrameOfReferencePreview and its peers have been "preview" for a long time. Is this stuff safe/reliable to use, and will it ever be considered "not preview"? Why do they exist at all? Could every Windows API not just directly expose a node GUID, which maps much more cleanly to OpenXR?

I would love to simplify much of my Unity code and track all these different things (anchors, QR codes, PV camera, research mode sensors etc) in a consistent manner using the SpatialGraphNode functionality in this plugin (especially now that it supports dynamic nodes) but my understanding of how everything fits together is still a bit hazy.

Can't persist anchors in editor playmode

It would help a lot to reduce iteration time as I develop on top of this if I could use Holographic Remoting for Play Mode. When I do, I get the warning at
AnchorsSample:54 "XRAnchorStore not available, sample anchor persistence functionality will not be enabled."
and the related warning at
AnchorStore:35 "LoadAnchorStoreAsync: The anchor store is not supported; either the feature is not enabled, or the related OpenXR extensions are not supported".

Is there a setup issue that can make this work?

Project validation spam errors in the console when no main camera

Hi,

I just updated to Unity 2023.2.12f1, so OpenXR updated automatically to 1.10.0.
I manually updated the Microsoft OpenXR plugin to 1.10.0.

In our application, there is a root scene that loads a sub-root scene depending on the platform.
The main camera is in the sub-root scene, so most of the time I don't have a main camera open in the editor.

When opening the OpenXR Project Validation window, it spams this error because there is no main camera:

NullReferenceException: Object reference not set to an instance of an object
Microsoft.MixedReality.OpenXR.Editor.PlatformValidation+<>c.<GenerateCameraFlagsRule>b__35_1 () (at ./Library/PackageCache/com.microsoft.mixedreality.openxr@6b31efb96f12/Editor/Settings/PlatformValidation.cs:465)
Unity.XR.CoreUtils.Editor.BuildValidator.GetCurrentValidationIssues (System.Collections.Generic.HashSet`1[T] failures, UnityEditor.BuildTargetGroup buildTargetGroup) (at ./Library/PackageCache/[email protected]/Editor/ProjectValidation/BuildValidator.cs:94)
Unity.XR.CoreUtils.Editor.ProjectValidationDrawer.UpdateIssues (System.Boolean focused, System.Boolean force) (at ./Library/PackageCache/[email protected]/Editor/ProjectValidation/ProjectValidationDrawer.cs:475)
Unity.XR.CoreUtils.Editor.ProjectValidationSettingsProvider.OnInspectorUpdate () (at ./Library/PackageCache/[email protected]/Editor/ProjectValidation/ProjectValidationSettingsProvider.cs:45)
UnityEditor.SettingsWindow.OnInspectorUpdate () (at <3e46df4d8aed4ee0a868f68610838aae>:0)
UnityEditor.HostView.OnInspectorUpdate () (at <3e46df4d8aed4ee0a868f68610838aae>:0)

Error "The type or namespace name 'SceneSystem' does not exist in the namespace 'CoreServices'" at BasicSample project startup

Hello,
I downloaded the repository and opened the BasicSample project with Unity 2020.3.11f1 and these three errors showed up:

[screenshot: mrtk-scenesystem]

I fixed the issue by adding Toolkit before the CoreServices namespace in each of the three lines of code with the error. For example, in the SceneLoader script I changed the line
public void LoadScene(string sceneName) => CoreServices.SceneSystem.LoadContent(sceneName, LoadSceneMode.Single);
into
public void LoadScene(string sceneName) => Toolkit.CoreServices.SceneSystem.LoadContent(sceneName, LoadSceneMode.Single);
and did the same for the other two scripts.

After the project recompiled the code, everything worked fine!

Haptic binding not available for HP Reverb G2

It looks as though the HP Reverb G2 Controller Profile from MSFT's Mixed Reality OpenXR Plugin (1.4.2 and 1.5.1) is misconfigured.

Specifically, it looks as though the profile is missing the following declarations pertaining to haptics:

Before the 'protected override void FinishSetup()' method declaration ~Line 154

/// <summary>
/// A <see cref="HapticControl"/> that represents the <see cref="HTCViveControllerProfile.haptic"/> binding.
/// </summary>
[Preserve, InputControl(usage = "Haptic")]
public HapticControl haptic { get; private set; }

Within the 'protected override void FinishSetup()' method declaration ~Line 178
haptic = GetChildControl<HapticControl>("haptic");

At the end of the ActionMap declaration, usages isn't set, and the name and localized name read vibrate.

// Haptics
new ActionConfig()
{
    name = "haptic",
    localizedName = "Haptic Output",
    type = ActionType.Vibrate,
    usages = new List<string>() { "Haptic" },
    bindings = new List<ActionBinding>()
    {
        new ActionBinding()
        {
            interactionPath = haptic,
            interactionProfileName = profile,
        }
    }
}

I created a custom profile based on the HP & Microsoft profile and added these changes, which resulted in functional haptics.

Invoking SkinnedMeshRenderer.BakeMesh() with tracked hand mesh in HoloLens 2 is creating an invalid mesh with OpenXR.

----Bug----
Invoking SkinnedMeshRenderer.BakeMesh() with a tracked hand mesh on HoloLens 2 creates an invalid mesh. Rather than a mesh representing the current deformation of the hand mesh, it creates a mesh containing an outdated deformation. We have noticed the outdated deformation most likely represents the hand state when the HoloLens first recognized the tracked hands in its field of view (FOV).
It happens with MRTK version 2.8.2 and with the latest versions of Unity 2020.3.x and Unity 2021.3.x.

------To reproduce------
Steps to reproduce the behavior:
1)Open the "BasicSample" unity project with the latest version of Unity 2020. (We used Unity 2020.3.43)
https://github.com/microsoft/OpenXR-Unity-MixedReality-Samples/tree/main/BasicSample

Use the MRTK (Foundation & StandardAssets packages) and Mixed Reality OpenXR plugins version 2.8.2

3)Now attach some script to "RiggedHandLeft" or "RiggedHandRight" prefab which will capture the mesh of the SkinnedMeshRenderer component associated with the tracked hand prefab.

You can try using this mono script, just attach it to "RiggedHandLeft" or "RiggedHandRight" prefab and trigger "BakeMesh()" when the hand is tracking in HoloLens 2.
https://gist.github.com/krishx007/a645f77c1284b2f69479da8e3266fce1

---------Expected behavior----------
Invoking SkinnedMeshRenderer.BakeMesh() with a tracked hand mesh on HoloLens 2 should generate mesh geometry matching the current deformation of the hand mesh, at the exact position where the tracked hand mesh is visible.

AzureSpatialAnchors.dll | FileNotFoundException Build Error

ASA: 2.11.0
MRTK: 2.7.2
Unity: 2020.3.26f1
AR Foundation: 4.2.2
MR OpenXR Plugin: 1.2.1
OpenXR Plugin: 1.3.1

Trying to build for UWP results in the following error:

FileNotFoundException: Packages/com.microsoft.azure.spatial-anchors-sdk.windows/Plugins/ARM/AzureSpatialAnchors.dll does not exist
System.IO.File.Copy (System.String sourceFileName, System.String destFileName, System.Boolean overwrite) (at <695d1cc93cca45069c528c15c9fdd749>:0)
PostProcessWinRT.CopyFileWithDebugSymbols (System.String source, System.String target, System.Boolean isFileManagedAssembly) (at C:/buildslave/unity/build/PlatformDependent/MetroPlayer/Extensions/Managed/PostProcessWinRT.cs:599)
PostProcessWinRT.CopyPlugins () (at C:/buildslave/unity/build/PlatformDependent/MetroPlayer/Extensions/Managed/PostProcessWinRT.cs:909)
PostProcessWinRT.Process () (at C:/buildslave/unity/build/PlatformDependent/MetroPlayer/Extensions/Managed/PostProcessWinRT.cs:164)
UnityEditor.UWP.BuildPostprocessor.PostProcess (UnityEditor.Modules.BuildPostProcessArgs args) (at C:/buildslave/unity/build/PlatformDependent/MetroPlayer/Extensions/Managed/ExtensionModule.cs:86)
Rethrow as BuildFailedException: Exception of type 'UnityEditor.Build.BuildFailedException' was thrown.
UnityEditor.UWP.BuildPostprocessor.PostProcess (UnityEditor.Modules.BuildPostProcessArgs args) (at C:/buildslave/unity/build/PlatformDependent/MetroPlayer/Extensions/Managed/ExtensionModule.cs:90)
UnityEditor.Modules.DefaultBuildPostprocessor.PostProcess (UnityEditor.Modules.BuildPostProcessArgs args, UnityEditor.BuildProperties& outProperties) (at <55729f52d042492e9efc384182ae2feb>:0)
UnityEditor.PostprocessBuildPlayer.Postprocess (UnityEditor.BuildTargetGroup targetGroup, UnityEditor.BuildTarget target, System.String installPath, System.String companyName, System.String productName, System.Int32 width, System.Int32 height, UnityEditor.BuildOptions options, UnityEditor.RuntimeClassRegistry usedClassRegistry, UnityEditor.Build.Reporting.BuildReport report) (at <55729f52d042492e9efc384182ae2feb>:0)
UnityEngine.GUIUtility:ProcessEvent(Int32, IntPtr, Boolean&)

dwm.exe crashes and restarts when activating depth debug overlay from Device Portal

When running an OpenXR app on HoloLens 2 and activating the hologram stability debug visualization from Device Portal, the app itself and dwm crash. This makes debugging hologram stability on OpenXR impossible.
Any other app that uses HolographicSpace is fine; I tested it with the now-deprecated Unity Windows XR plugin.

HoloLens OS version: 20348.1018
Unity 2020.3.17
Unity OpenXR plugin: 1.2.8
Microsoft OpenXR plugin: 1.0.3
OpenXR runtime: 107.2109.10010.0

Remote app listening error

I'm trying to connect the Player to the Remote with the remote's listening method. The method throws this exception:

NullReferenceException: Object reference not set to an instance of an object
Unity.Collections.LowLevel.Unsafe.NativeArrayUnsafeUtility.GetUnsafePtr[T] (Unity.Collections.NativeArray`1[T] nativeArray) (at <91dc399f41a440558c716315c7e834ab>:0)
Microsoft.MixedReality.OpenXR.Remoting.AppRemotingSubsystem.InitializeRemoting () (at Library/PackageCache/[email protected]/Runtime/Subsystems/AppRemotingSubsystem.cs:170)
Microsoft.MixedReality.OpenXR.Remoting.AppRemotingPlugin.OnSystemChange (System.UInt64 systemId) (at Library/PackageCache/[email protected]/Runtime/FeaturePlugins/AppRemotingPlugin.cs:78)
UnityEngine.XR.OpenXR.Features.OpenXRFeature.ReceiveNativeEvent (UnityEngine.XR.OpenXR.Features.OpenXRFeature+NativeEvent e, System.UInt64 payload) (at Library/PackageCache/[email protected]/Runtime/Features/OpenXRFeature.cs:634)
UnityEngine.XR.OpenXR.OpenXRLoaderBase.ReceiveNativeEvent (UnityEngine.XR.OpenXR.Features.OpenXRFeature+NativeEvent e, System.UInt64 payload) (at Library/PackageCache/[email protected]/Runtime/OpenXRLoader.cs:635)
(wrapper native-to-managed) UnityEngine.XR.OpenXR.OpenXRLoaderBase.ReceiveNativeEvent(UnityEngine.XR.OpenXR.Features.OpenXRFeature/NativeEvent,ulong)
UnityEngine.IntegratedSubsystemDescriptor`1[TSubsystem].Create () (at <2e588915cb5341eb91148de7ca64db2c>:0)
UnityEngine.IntegratedSubsystemDescriptor`1[TSubsystem].CreateImpl () (at <2e588915cb5341eb91148de7ca64db2c>:0)
UnityEngine.IntegratedSubsystemDescriptor.UnityEngine.ISubsystemDescriptor.Create () (at <2e588915cb5341eb91148de7ca64db2c>:0)
UnityEngine.XR.Management.XRLoaderHelper.CreateSubsystem[TDescriptor,TSubsystem] (System.Collections.Generic.List`1[T] descriptors, System.String id) (at Library/PackageCache/[email protected]/Runtime/XRLoaderHelper.cs:118)
UnityEngine.XR.OpenXR.OpenXRLoaderBase.CreateSubsystem[TDescriptor,TSubsystem] (System.Collections.Generic.List`1[T] descriptors, System.String id) (at Library/PackageCache/[email protected]/Runtime/OpenXRLoader.cs:486)
UnityEngine.XR.OpenXR.OpenXRLoaderBase.CreateSubsystems () (at Library/PackageCache/[email protected]/Runtime/OpenXRLoader.cs:267)
UnityEngine.XR.OpenXR.OpenXRLoaderBase.InitializeInternal () (at Library/PackageCache/[email protected]/Runtime/OpenXRLoader.cs:240)
UnityEngine.XR.OpenXR.OpenXRLoaderBase.Initialize () (at Library/PackageCache/[email protected]/Runtime/OpenXRLoader.cs:185)
UnityEngine.MonoBehaviour:StartCoroutine(IEnumerator)
Microsoft.MixedReality.OpenXR.Remoting.StartOrStopXRHelper:Start() (at Library/PackageCache/[email protected]/Runtime/Subsystems/AppRemotingSubsystem.cs:466)

I've cloned the repo, opened the RemotingSample and started the RemotingSampleScene without any modification. I've tried it with Unity 2021.3.29, 2021.3.27 and 2020.3.44 and the results are the same.


Spinning White Balls when pressing the power button then resuming app

Note: This issue has been moved from this other repo for better visibility WikkidEdd/OpenXRTestProject#2

Describe the bug

When you press the power button of the HL2 while the app is running and then resume, it can result in indefinitely seeing spinning white balls. This usually happens after one or two cycles of putting the device on standby and then resuming the app, but can sometimes take more attempts.

To reproduce

Steps to reproduce the behavior:

  1. Clone this git repo
  2. Open Basic Sample Unity project
  3. Switch to UWP build platform
  4. Build the application.
  5. Switch solution config to Release and ARM64
  6. Build and Start the application
  7. Wait until the app has started, wait 5 seconds, then short press on the power button to put the HL2 to sleep
  8. Wait another 5 seconds, press the power button to wake up the device. Resume the app from the app slate in the world (or from the start menu if the app slate doesn't exist yet).
  9. Repeat steps 7 and 8 until the app no longer resumes and shows the spinning white balls.

Expected behavior

The application should resume without white spinning balls.

Your setup (please complete the following information)

  • Unity Version 2020.3.19
  • Unity OpenXR Package 1.2.8
  • Mixed Reality OpenXR Package 1.0.3 and 1.1.1 tested
  • Holographic OS 20348.1432
  • OpenXR Runtime Version 107.2109.10010.0

Target platform (please complete the following information)

  • HoloLens 2

Additional context

If you run with the debugger attached when the problem occurs you will get an exception at the point you press the power button prior to seeing the white spinning balls.

Callstack (trimmed to make it more readable):
GameAssembly.dll!OpenXRLoaderBase_Internal_EndSession
GameAssembly.dll!OpenXRLoaderBase_StopInternal
GameAssembly.dll!OpenXRLoaderBase_ReceiveNativeEvent
GameAssembly.dll!ReversePInvokeWrapper_OpenXRLoaderBase_ReceiveNativeEvent
GameAssembly.dll!OpenXRLoaderBase_Internal_PumpMessageLoop
GameAssembly.dll!OpenXRLoaderBase_ProcessOpenXRMessageLoop
GameAssembly.dll!_ClearLastSubmittedFrame_Invoke
GameAssembly.dll!BeforeRenderHelper_Invoke
GameAssembly.dll!Application_InvokeOnBeforeRender
GameAssembly.dll!RuntimeInvoker_FalseVoid
GameAssembly.dll!il2cpp::vm::Runtime::Invoke

Holographic Remoting Instantly Disconnects

Hi guys, I'm having issues with the holographic remoting aspect of the basic sample app.

ISSUE:
Upon attempting to connect the HAR player app and the remote app on the PC, the HAR player app displays "connecting" for a few seconds, but then disconnects. On the remote app, it begins connecting for a second and I can see the basic sample menu appear for a moment, but then it immediately disconnects, yielding the disconnect message "Disconnected to XX.XXX.XX. Reason is None".

EXPECTED BEHAVIOR:
Upon clicking connect in the remote app with the HAR player in "listen" mode or upon launching the HAR player in "connect" mode with the remote app in "listen" mode, the basic sample app should begin streaming on the HoloLens.

HARDWARE, SOFTWARE, AND SETTINGS USED:
Hardware:

  • HoloLens 2
  • Windows 10 PC (CPU: Intel i7 8700, GPU: NVIDIA GeForce GTX 1070)

Software:

Unity Settings:

  • Packages Used [screenshot: Packages]

  • OpenXR UWP Settings [screenshot: OpenXRUWP]

  • OpenXR PC Standalone Settings [screenshot: OpenXRStandalone]

  • UWP Capabilities Settings [screenshot: Capabilities]

  • UWP Build Settings [screenshot: UWPBuild]

  • PC Standalone Build Settings [screenshot: PCBuild]

SOLUTIONS ATTEMPTED:

  • Built the remote app both as a UWP and a Win32 app; both exhibit the same issue
  • Tried the HAR sample app in both listen and connect modes using the command-line arguments in Visual Studio. Both modes attempt to connect (display "connecting") then instantly disconnect
  • Tried using the production HAR app from the Microsoft Store while using the remote app in connect mode. Same issue of instant disconnection encountered.
  • Ensured both the remote app and Visual Studio were allowed through the Windows firewall
  • Double-checked the OpenXR settings against the readme documentation provided here

SIMILAR REMOTING ISSUE:
I've encountered a similar issue while attempting to use Holographic Remoting for Play Mode in Unity. While attempting to do this, the HAR app also displays "connecting" for a second, then immediately disconnects, and the scene begins playing in the Unity game view as normal. Details of that issue can be found in the comment under my name (posted two days ago) in the following issue: microsoft/MixedReality-HolographicRemoting-Samples#50

Thanks for all your help and any information you could provide regarding this issue would be greatly appreciated.

-Ian

1.7 Release testing - App freezes on disconnect in app remoting - basic sample

Unity version used: 2020.3.35 / 2021.3.1 (Oculus Integration installed; not sure if this is the reason)
Unity MR Plugin: 1.7
Branch used: https://github.com/microsoft/OpenXR-Unity-MixedReality-Samples/tree/Update-to-MR-OpenXR-Plugin-1.7-release

Repro steps:

  • Checkout above branch, open basic sample
  • Build app remoting standalone/UWP app
  • Run the app and connect to player app running on HL2
  • Disconnect using "Disconnect" button in the app
  • Observe that the 2D UI does not appear on the PC and it freezes on disconnect.

Logs:

Look towards the end of the logs: it is stuck on what seems to be an MRTK issue - "ArticulatedHand Controller Right was never registered with the Input Manager!"

[XR] [31292] [13:00:52.351][Info ] ==== OpenXR Swapchain Details ====
[XR] [31292] [13:00:52.351][Info ]
[XR] [31292] [13:00:52.351][Info ] Render Mode: Single Pass Instanced
[XR] [31292] [13:00:52.351][Info ] Depth Submission Mode: Depth24Bit
[XR] [31292] [13:00:52.351][Info ]
[XR] [31292] [13:00:52.351][Info ] Swapchain Formats: [c:91] 29 87 28 40 [d:20] 45 55
[XR] [31292] [13:00:52.351][Info ] Swapchain Views: (2)
[XR] [31292] [13:00:52.351][Info ] [0]: Width=1584, Height=1030, SampleCount=1
[XR] [31292] [13:00:52.351][Info ]
[XR] [31292] [13:00:52.351][Info ] ==== Last 20 non-XR_SUCCESS returns ====
[XR] [31292] [13:00:52.351][Info ] [SUCCESS] xrGetReferenceSpaceBoundsRect: XR_SPACE_BOUNDS_UNAVAILABLE (2x)
[XR] [31292] [13:00:52.351][Info ] [FAILURE] xrSuggestInteractionProfileBindings: XR_ERROR_PATH_UNSUPPORTED (1x)
[XR] [31292] [13:00:52.351][Info ]
[XR] [31292] [13:00:52.351][Info ] ==== Last 20 Events ====
[XR] [31292] [13:00:52.351][Info ] XrEventDataSessionStateChanged: XR_SESSION_STATE_UNKNOWN->XR_SESSION_STATE_IDLE
[XR] [31292] [13:00:52.351][Info ] XrEventDataSessionStateChanged: XR_SESSION_STATE_IDLE->XR_SESSION_STATE_READY
[XR] [31292] [13:00:52.351][Info ] XrEventDataSessionStateChanged: XR_SESSION_STATE_READY->XR_SESSION_STATE_SYNCHRONIZED
[XR] [31292] [13:00:52.351][Info ] XrEventDataSessionStateChanged: XR_SESSION_STATE_SYNCHRONIZED->XR_SESSION_STATE_VISIBLE
[XR] [31292] [13:00:52.351][Info ] XrEventDataSessionStateChanged: XR_SESSION_STATE_VISIBLE->XR_SESSION_STATE_FOCUSED
[XR] [31292] [13:00:52.351][Info ] XrEventDataSessionStateChanged: XR_SESSION_STATE_FOCUSED->XR_SESSION_STATE_VISIBLE
[XR] [31292] [13:00:52.351][Info ] XrEventDataSessionStateChanged: XR_SESSION_STATE_VISIBLE->XR_SESSION_STATE_SYNCHRONIZED
[XR] [31292] [13:00:52.351][Info ] XrEventDataSessionStateChanged: XR_SESSION_STATE_SYNCHRONIZED->XR_SESSION_STATE_STOPPING
[XR] [31292] [13:00:52.351][Info ] XrEventDataSessionStateChanged: XR_SESSION_STATE_STOPPING->XR_SESSION_STATE_IDLE
[XR] [31292] [13:00:52.351][Info ] XrEventDataSessionStateChanged: XR_SESSION_STATE_IDLE->XR_SESSION_STATE_EXITING
[XR] [31292] [13:00:52.351][Info ]

==== End Unity OpenXR Diagnostic Report ====

[XR] [MROpenXR][Info ][13:00:52.352002][tid:7a3c] RemoteSpeechProvider_ProviderUnregistered
[AppRemotingPlugin] OnInstanceDestroy, remotingState was Disconnecting.
[AppRemotingSubsystem] DeinitializeLoader
Setting up 12 worker threads for Enlighten.
Thread -> id: 792c -> priority: 1
Thread -> id: 8a38 -> priority: 1
Thread -> id: 446c -> priority: 1
Thread -> id: 42e8 -> priority: 1
Thread -> id: 9078 -> priority: 1
Thread -> id: 3778 -> priority: 1
Thread -> id: 6880 -> priority: 1
Thread -> id: 7ddc -> priority: 1
Thread -> id: 8ed4 -> priority: 1
Thread -> id: 2024 -> priority: 1
Thread -> id: 7fc0 -> priority: 1
Thread -> id: 6cec -> priority: 1
[MRTK Issue] ArticulatedHand Controller Right was never registered with the Input Manager!

Image stabilization problems after resuming the app

Note: This issue has been moved from this other repo for better visibility WikkidEdd/OpenXRTestProject#3

Describe the bug

If you use the start menu to put the app into the background and then tap the app slate to resume it, the stabilization ends up disabled.

To reproduce

Steps to reproduce the behavior:

  1. Clone this git repo
  2. Open Basic Sample Unity project
  3. Switch to UWP build platform
  4. Build the application.
  5. Switch solution config to Release and ARM64
  6. Build and Start the application
  7. Observe that the content in the scene is stable when moving your head
  8. Bring up the start menu and press the home button
  9. Resume the app by pressing the app slate in the world (or from the start menu if the app slate doesn't exist yet).
  10. Observe that the content is now unstable when moving your head

It's probably also worth noting that in this sample project, when the issue occurs, the whole image is consistently unstable, but in our actual app we get slightly different behavior. It's like parts of the depth buffer are incomplete, so some parts are stable and some parts jitter.

Expected behavior

The application should resume with the same quality of stabilization as before backgrounding the app.

Your setup (please complete the following information)

  • Unity Version 2020.3.19
  • Unity OpenXR Package 1.2.8
  • Mixed Reality OpenXR Package 1.0.3 and 1.1.1 tested
  • Holographic OS 20348.1018
  • OpenXR Runtime Version 107.2109.10010.0

Target platform (please complete the following information)

  • HoloLens 2


Holographic remote player loses rendering

Describe the bug
When performing GPU-heavy tasks, the player (HoloLens 2) frequently loses its rendering while the remote side (PC Unity app) still renders the running app.
Along with the loss of rendering there is a dip in NVENC (NVIDIA encoder) usage, and suddenly the GPU starts performing close to 100% (image attached).

The application is still running (it continues sending the streams (20 Mbps) and stays responsive: moving the head changes the hologram perspective).

I can't see anything interesting in the application logs (remote side).

Disconnecting and then reconnecting to the holographic remote gets the streams back (while the application is still running).

Note: the phenomenon is resolved if I downgrade the Mixed Reality OpenXR Plugin to version 1.4.0.

To Reproduce
Steps to reproduce the behavior:

  1. Launch remoting player on the HoloLens
  2. Run a Unity3D-based application (remote side)
  3. Connect from the remote-side application to the player.
  4. Perform a GPU-intensive task

Remote side (your Windows PC):

  • OS version: Win10 Pro 22H2
  • using Unity version: 2021.3.12f1
  • We are using OpenXR - Mixed Reality OpenXR Plugin v1.7.0
  • Graphics card(s) / graphics driver version: I have 2 GPUs - NVIDIA RTX A5000 x2, driver version: 528.49

Player side (e.g. your HoloLens 2)

  • OS version: 20348.1535.arm64fre.fe_release_svc_sydney_rel_prod.230106-1454
  • We use the store player version 2.9.0

Do you have any idea what can cause this kind of problem?
Have you ever faced such a problem before?
Any kind of workaround, or direction for understanding the root cause of the problem, would be welcome.

Thanks,
Almog


How to use XRAnchorTransferBatch.ImportAsync(Stream) on HoloLens 2

OpenXR plugin version 1.8.0.
I use XRAnchorTransferBatch to export anchors, but when I import them using XRAnchorTransferBatch.ImportAsync(Stream), null is returned.

Stream stream = new MemoryStream(anchorData);
stream.Seek(0, SeekOrigin.Begin);
var transferBatch = await XRAnchorTransferBatch.ImportAsync(stream);

anchorData is the transferred byte[].
Result: transferBatch is null.
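
For context, here is a hedged sketch of the full export/import round trip this is attempting. The AddAnchor/ExportAsync/AnchorNames/LoadAnchor names follow the plugin documentation but should be treated as assumptions and checked against your plugin version:

using System.IO;
using System.Threading.Tasks;
using Microsoft.MixedReality.OpenXR;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hedged sketch of the anchor-transfer round trip between two devices.
public class AnchorTransferSketch : MonoBehaviour
{
    // On the sending device: serialize one anchor to bytes.
    public async Task<byte[]> Export(ARAnchor anchor)
    {
        var batch = new XRAnchorTransferBatch();
        batch.AddAnchor(anchor.trackableId, "shared-anchor"); // name is arbitrary
        using (Stream stream = await XRAnchorTransferBatch.ExportAsync(batch))
        using (var memory = new MemoryStream())
        {
            await stream.CopyToAsync(memory);
            return memory.ToArray();
        }
    }

    // On the receiving device: import and re-create each named anchor.
    public async Task Import(byte[] anchorData)
    {
        var batch = await XRAnchorTransferBatch.ImportAsync(new MemoryStream(anchorData));
        if (batch == null) { Debug.LogWarning("Import returned null."); return; }
        foreach (string name in batch.AnchorNames)
            batch.LoadAnchor(name); // re-creates the anchor in the local session
    }
}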

Build for iOS fails with compile errors in SpatialAnchorsSample.cs

ASA 2.10.1
MRTK 2.7.2
Unity 20.21.1.15fc1
AR Foundation 4.1.7
ARKit Plugin 4.1.7
Mixed Reality OpenXR Plugin 1.0.1
OpenXR Plugin 1.2.3

Building for HoloLens works without any problems. When I switch to iOS, SpatialAnchorsSample.cs has compile errors:
Assets\Scripts\SpatialAnchorsSample.cs(33,19): error CS0246: The type or namespace name 'CloudSpatialAnchorWatcher' could not be found (are you missing a using directive or an assembly reference?)
Assets\Scripts\SpatialAnchorsSample.cs(101,71): error CS0246: The type or namespace name 'AnchorLocatedEventArgs' could not be found (are you missing a using directive or an assembly reference?)

Also, m_spatialAnchorManager has neither an AnchorLocated nor a LocateAnchorsCompleted delegate. I guess the delegate problem is related to the namespace problem.

Missing Hand Ray in Unity Mixed Reality Template - Aim Position/Rotation/Flags

I have been testing cross-platform with the Unity MR template in Unity 2022.3.21f1 with Microsoft.MixedReality.OpenXR 1.9.0.

When building with the Meta OpenXR plugin and targeting Quest Pro, this works as expected.

On HoloLens 2, hands track correctly but hand rays do not appear. This appears to be because the values for the Ray interactor pose refer to Aim Position and Aim Rotation, which are present in the Meta OpenXR plugin.


Setting these values to use a fallback XR Hand Device Position and Rotation (or explicitly to HoloLens equivalents) works-ish, but I'm unsure as to whether these are the "correct" poses.


In this template project it also does not correctly disable/re-enable the pointer after hand tracking is lost/regained, which appears to be due to the flags, again provided by the Meta plugin, that are used to switch between poke/direct/ray interactors:


MRTK also had a separate InteractionModeManager which seemed to have the same goal.

So my queries are:

  1. How should I be specifying "Aim" position and rotation? Is this something the Microsoft OpenXR plugin can provide?
  2. Is there an equivalent to the "Flags" I can use to determine when I should be using Poke, Direct or Ray interactors?

Unity also doesn't have issues enabled on their sample GitHub, otherwise I'd report this there as well.

Question: Does Mixed Reality OpenXR Plugin version 0.9.2 support haptic feedback?

Hello, I kind of found my way here because I was going through the Learn.Unity.com "Create with VR" course, following the lessons using an HP Reverb headset/controllers. The course isn't tailored to WMR headsets, but once you understand that you just have to use the MixedRealityFeatureTool to install the Mixed Reality OpenXR Plugin, there really isn't much difference between the WMR headsets and Oculus or Vive.
Anyhow, I have now come to a possible issue: haptic feedback doesn't seem to be working! Per the lesson, which basically just instructs you to select the Left or Right controller, select the XR Ray Interactor component, and configure some of the Haptic events, the controller should then vibrate. (Create with VR - Lesson 2.1)
I can provide more detail about my setup if needed, but I figure the first step is to confirm whether Mixed Reality OpenXR Plugin version 0.9.2 does in fact support haptic feedback.
If there's another place I should be looking for this info, please let me know. Thanks!

XRAnchorStore not synced with SpatialAnchorStore

The XRAnchorStore loaded with the XRAnchorStore.LoadAsync() function doesn't contain anchors saved to the SpatialAnchorStore with the SpatialAnchorStore.TrySave() function.

I need to restart my app to get the imported anchors.
Reloading the XRAnchorStore does not do the trick.
Re-instantiating the ARSession doesn't either.

Expected behavior: The XRAnchorStore should be synced with the SpatialAnchorStore.

No GetDeviceLayoutName() override detected


Hi, sorry, I'm not sure if this is the best place to create this issue. Since Unity OpenXR Plugin v1.9.1 (2023-10-19), they have added a couple of new API functions for controller profiles (OpenXRInteractionFeature), the first of which, GetDeviceLayoutName(), if not overridden results in a warning when loading up a project and when running it in Play mode/Standalone.

i.e.

No GetDeviceLayoutName() override detected in HP Reverb G2 Controller Profile. Binding path validator for this interaction profile is not as effective. To fix, add GetDeviceLayoutName and GetInteractionProfileType override in this profile.
UnityEditor.EditorApplication:Internal_CallUpdateFunctions ()

Getting `OnInstanceDestroy, remotingState was Connect` when attempting to initiate holographic remoting connection.

I am unable to establish a holographic remoting session due to the following error:

[AppRemotingPlugin] OnInstanceDestroy, remotingState was Connect.
UnityEngine.DebugLogHandler:Internal_Log(LogType, LogOption, String, Object)
UnityEngine.DebugLogHandler:LogFormat(LogType, Object, String, Object[])
UnityEngine.Logger:Log(LogType, Object)
UnityEngine.Debug:Log(Object)
Microsoft.MixedReality.OpenXR.Remoting.AppRemotingPlugin:OnInstanceDestroy(UInt64)
UnityEngine.XR.OpenXR.Features.OpenXRFeature:ReceiveNativeEvent(NativeEvent, UInt64)
UnityEngine.XR.OpenXR.OpenXRLoaderBase:ReceiveNativeEvent(NativeEvent, UInt64)
UnityEngine.XR.OpenXR.OpenXRLoaderBase:Internal_DestroySession()
UnityEngine.XR.OpenXR.OpenXRLoaderBase:Deinitialize()
UnityEngine.XR.OpenXR.OpenXRLoaderBase:Initialize()
UnityEngine.XR.Management.<InitializeLoader>d__24:MoveNext()
UnityEngine.SetupCoroutine:InvokeMoveNext(IEnumerator, IntPtr)
UnityEngine.MonoBehaviour:StartCoroutineManaged2(IEnumerator)
UnityEngine.MonoBehaviour:StartCoroutine(IEnumerator)
Microsoft.MixedReality.OpenXR.BasicSample.AppRemotingSample:ConnectToRemote(String)
UnityEngine.Events.UnityAction`1:Invoke(T0)
UnityEngine.Events.InvokableCall`1:Invoke(T1)
UnityEngine.Events.CachedInvokableCall`1:Invoke(Object[])
UnityEngine.Events.UnityEvent:Invoke()
UnityEngine.UI.Button:Press()
UnityEngine.UI.Button:OnPointerClick(PointerEventData)
UnityEngine.EventSystems.ExecuteEvents:Execute(IPointerClickHandler, BaseEventData)
UnityEngine.EventSystems.EventFunction`1:Invoke(T1, BaseEventData)
UnityEngine.EventSystems.ExecuteEvents:Execute(GameObject, BaseEventData, EventFunction`1)
UnityEngine.EventSystems.StandaloneInputModule:ReleaseMouse(PointerEventData, GameObject)
UnityEngine.EventSystems.StandaloneInputModule:ProcessMousePress(MouseButtonEventData)
UnityEngine.EventSystems.StandaloneInputModule:ProcessMouseEvent(Int32)
UnityEngine.EventSystems.StandaloneInputModule:ProcessMouseEvent()
UnityEngine.EventSystems.StandaloneInputModule:Process()
Microsoft.MixedReality.Toolkit.Input.MixedRealityInputModule:Process()
UnityEngine.EventSystems.EventSystem:Update()

Steps to reproduce:

  1. Clone this repository
  2. Check out the 'main' branch or the branches with tags 'v1.6.0' or 'v1.7.0'. (I get the same result on all of these versions; remoting works just fine with tag 'v1.5.0'.)
  3. Run the sample project in OpenXR-Unity-MixedReality-Samples\RemotingSample\ (I'm using Unity version 2020.3.37f1)
  4. Set target platform to 'Universal Windows Platform' and build.
  5. In Visual studio, open project and use 'x64' and 'local machine' settings running in debug.
  6. Launch the holographic remote player on Hololens 2
  7. Type in IP address in the application
  8. Press 'connect'
  9. You should see the OnInstanceDestroy error from above. No connection is made.

According to this site: https://learn.microsoft.com/en-us/windows/mixed-reality/develop/unity/known-issues
Could this known issue still be affecting OpenXR plugin versions 1.6 and 1.7?


Any help is greatly appreciated. Thank you!

Hand Mesh doesn't appear in HandTracking scene

I am attempting to see the Hand Mesh visualization in the HandTracking scene using Holographic Remoting for Play Mode. The hand joints track as they should, but toggling "Hand Mesh" on and off does nothing. In HandMesh.cs, the method call on line 35, handMeshTracker.TryGetHandMesh(FrameTime.OnUpdate, mesh), always returns false.
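
For reference, the call in context (a hedged sketch around the plugin's HandMeshTracker; TryGetHandMesh matches the usage above, while the static Left accessor and TryLocateHandMesh are assumptions taken from the plugin docs):

using Microsoft.MixedReality.OpenXR;
using UnityEngine;

// Sketch: pull the current hand mesh each frame and place it in the world.
public class HandMeshProbe : MonoBehaviour
{
    [SerializeField] private MeshFilter meshFilter;
    private readonly HandMeshTracker tracker = HandMeshTracker.Left;

    void Update()
    {
        Mesh mesh = meshFilter.mesh;
        // Both calls return false when no hand mesh is available this frame.
        if (tracker.TryGetHandMesh(FrameTime.OnUpdate, mesh) &&
            tracker.TryLocateHandMesh(FrameTime.OnUpdate, out Pose pose))
        {
            transform.SetPositionAndRotation(pose.position, pose.rotation);
        }
    }
}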

Remote speech limitations

Mixed Reality OpenXR plugin appears to implement XR_MSFT_holographic_remoting_speech by somehow interfacing with the pre-existing UnityEngine.Windows.Speech APIs. This implementation can cause issues where remote speech doesn't work depending on the speech recognition settings of the machine running the remote application.

To reproduce:

  • Open the Basic Sample app from this repo
  • Enable 'Holographic Remoting remote app' OpenXR feature for Standalone
  • Run the MainMenu scene from Unity, or build and run Standalone
  • Use the AppRemoting window to connect to a HL2 running Holographic Remoting Player or similar
  • Navigate to the Speech scene
  • Observe that speech commands in the Player ('red', 'blue', etc) may or may not work depending on the speech settings of the remote PC

In a non-working scenario we will see in the logs a UnityException: Speech recognition is not supported on this machine. during initialization of speech recognizers. This is not specific to this sample, it is reproducible in any app using Mixed Reality OpenXR Plugin and the issue occurs regardless whether speech recognition is initialized before or after holographic remoting is connected.

This isn't an inherent limitation of remote speech, as remote speech functionality in the native remoting sample app works perfectly fine even in scenarios where Mixed Reality OpenXR plugin fails.

This becomes a real issue when you want to deploy a holographic remoting executable. End-users potentially need to mess around with speech and language settings (potentially requiring Administrator access) on the remote PC to get remote speech working even though this is not a real technical requirement.

I suspect this might be (at least in part) an issue on the Unity side but even if so, a resolution could be to expose the underlying remote speech OpenXR APIs via the Mixed Reality OpenXR plugin in a convenient way. These APIs would be guaranteed to work regardless of speech settings on the remote PC side. This could pair nicely with a 'don't integrate with UnityEngine.Windows.Speech' option on the Holographic Remoting remote app feature which would allow apps to separately make use of speech recognition on both the remote and the player side through the two separate APIs.

How to reset session when entering a new space?

I've recently found that ARSession.state does not directly correspond to whether the device pose is "tracked".

Consider the following scenario:

  1. Device is performing as normal (device pose == tracked, ARSession.state == SessionTracking)
  2. Sensors get covered (device pose == not tracked, ARSession.state == SessionInitializing)
  3. Device is moved to a new location, sensors are uncovered (device pose == tracked, ARSession.state == SessionInitializing). In this phase, hands and the device are tracked correctly (but with an arbitrary offset to the XR Origin). Meshes (ARMeshManager) and QR codes do not seem to function.
  4. Device is moved back to original location, recognizes space (device pose == tracked, ARSession.state == SessionTracking)

In many cases, this makes sense: objects are anchored to the original space, meshes created, QR codes located, etc. However, this causes problems if the user actually wants to continue using the app in the new space.

I'd like to forcefully reset the session (start in a new arbitrary coordinate system) after (3) so that the app can continue in the new space, building new spatial meshes etc.

I've found that ARSession.Reset() does not seem to resolve this; the state just remains as SessionInitializing and AR features do not work until the original space is rediscovered. XRInputSubsystem.TryRecenter() returns true but does not fix the ARSession state.
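
What I've tried, in code form (standard ARFoundation / XR Management calls):

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;
using UnityEngine.XR.ARFoundation;

// The reset attempts described above: ARSession.Reset plus TryRecenter.
public class SessionResetAttempts : MonoBehaviour
{
    [SerializeField] private ARSession session;

    public void ResetSession()
    {
        // Destroys the session's trackables and re-initializes tracking.
        session.Reset();

        // Also try recentering the tracking origin on each input subsystem.
        var subsystems = new List<XRInputSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);
        foreach (var subsystem in subsystems)
            Debug.Log($"TryRecenter returned {subsystem.TryRecenter()}");
    }
}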

So questions are:

  1. Is this expected behaviour or is something more complex going on in my setup?
  2. How do I go about resetting the session / allowing the app to continue in the new location?

Setup

Unity 2022.3.8f1 (Windows)
HoloLens 2
MRTK3 pre-16
Mixed Reality OpenXR Plugin 1.7.2
Unity OpenXR Plugin 1.8.2
ARFoundation 5.0.7

Compatibility issue with the UnityEngine.Windows.Speech namespace?

Hello, I'm posting this in a few places as I'm not sure exactly who is the culprit for this issue.

I'm making a VR training app and the target platform is Standalone>Windows. I'm using Unity 2020.3 and OpenXR. See below for specific versions.

I have a script that uses the UnityEngine.Windows.Speech namespace that lets me subscribe to the PhraseRecognizedDelegate OnPhraseRecognized event. ( this API is part of the UnityEngine.CoreModule.dll )

My issue: everything works great EXCEPT when I have a WMR headset connected and the WMR Portal running with the Mixed Reality OpenXR runtime.

I have tested that my voice commands DO work fine when running in the editor (without the WMR headset attached), as a built .exe (again without the WMR headset attached), AND even with an Oculus Quest (via Link and the Oculus OpenXR runtime).

My voice commands DO NOT work with either the HP Reverb G1 or G2 when running the WMR Portal and the Mixed Reality OpenXR runtime.

I have verified that the microphone device is working as I created a simple echo script that plays what the microphone hears back to me.

As another data point I have an older VR project that I built with Unity 2019.4 with the same target but that uses the Windows Mixed Reality plugin (not OpenXR). The voice commands DO work fine in that app.

So this kind of points me to a compatibility problem between the Mixed Reality OpenXR runtime and UnityEngine.CoreModule.dll, specifically the UnityEngine.Windows.Speech namespace.

Any help with this would be appreciated. Thank you!

SW Versions:
Windows 11 Pro
Unity 2020.3.16
WMR Runtime 107.2109.10010
Packages Include:
XR Plugin Management 4.07
XR Interaction Toolkit 1.0.0-pre.5
OpenXR Plugin 1.2.8
Microsoft Plugins:
Mixed Reality OpenXR Plugin 1.0.2
Mixed Reality Input 0.9.2006
Microsoft Spatializer 1.0.196

Gap of the pinch gesture

Lately we received user feedback that they accidentally grab objects with the pinch gesture. We've investigated this issue, and it turned out that we can grab virtual content with a wider gap between the index finger and the thumb. If I grab a window/3D icon in the HoloLens main menu, the gap is smaller (half of what it is in the app).

It seems to be related to the Mixed Reality OpenXR plugin; the only change we made to the project was downgrading the plugin to 1.7.2, and that solved this pinch gesture issue.

To reproduce:

  • Open a Unity 2020 LTS or 2021 LTS project
  • Import MRTK 2.8.3 with MR OpenXR plugin 1.8.0 or above
  • Create MR scene for HoloLens 2
  • Add 3d object with MRTK's ObjectManipulator on it
  • Remote to the editor
  • Grab the object with the pinch gesture in far interaction mode

appremoting.cs question

I connected the HoloLens 2, and the editor console returns [AppRemotingPlugin] Connect InitializeLoader. No connection is established.
I tracked the problem down and found that XRGeneralSettings.Instance.Manager.InitializeLoader() cannot create XRGeneralSettings.Instance.Manager.activeLoader. More information: InitializeInternal() -> CreateSubsystems() -> CreateSubsystem<XRDisplaySubsystemDescriptor, XRDisplaySubsystem>(s_DisplaySubsystemDescriptors, "OpenXR Display") -> base.CreateSubsystem<TDescriptor, TSubsystem>(descriptors, id) -> subsys = descriptor.Create() does not work.
Hoping for a quick reply.

AppRemoting: DisconnectReason not getting reported

We are using AppRemoting.StartConnectingToPlayer to connect Unity to the HoloLens, but when there is an error, AppRemoting.Disconnecting always reports DisconnectReason.None.

I believe the issue is that AppRemotingSubsystem.OnSessionLossPending calls TryGetConnectionState and thus "consumes" the disconnect reason, so when AppRemotingSubsystem.ConnectRoutine later calls TryGetConnectionState it doesn't get the reason.

I have verified this with a debugger. The AppRemotingSubsystem.OnSessionLossPending does get the correct disconnect reason.
