
mixed-reality's Introduction

Virtual hummingbird with human hand

Welcome to Mixed Reality documentation, the place for all things MR, VR, and AR at Microsoft!

Contributing

If you're contributing or updating docs content, please make sure your pull requests target the correct sub-docset (mr-dev-docs, enthusiast-guide, and so on). New contributors should check out our more detailed contribution guidelines for each sub-docset:

For docs-related issues, use the footers at the bottom of each doc, or submit directly to MicrosoftDocs/mixed-reality/issues.

Feel free to send any questions about contribution policies or processes to Harrison Ferrone or Sean Kerawala via Teams or email.

Getting started

Every path through our docs has a curated journey to help you find your footing. Whether it's design, development, or distributing your apps to the world, we've got you covered.

Mixed Reality for HoloLens

We recommend starting with the Mixed Reality basics and moving on from there:

If you're interested in the design side of things:

When you're ready to start developing, choose the engine and device that best suits your needs:

For engine-specific content, choose one of the following paths:

When you're finally ready to get your app out to your users:

VR enthusiast guide

If you're new to VR devices, we recommend starting with our beginner guide:

Contributor License Agreement (CLA)

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.

Microsoft Open Source Code of Conduct

This project has adopted the Microsoft Open Source Code of Conduct. For more information, see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.

mixed-reality's People

Contributors

amollis, cmcclister, cre8ivepark, danielescudero, grbury, harrisonyu, hferrone, jing1201, kegodin, keveleigh, mattwojo, mattzmsft, maxwang-ms, mr0ng, ms738, polar-kev, qianw211, rogpodge, rwinj, sean-kerawala, shengkait, sostel, species521, szymonsps, thetuvix, typride, varunsiddaraju, vinayak0706, vtieto, yuripek

mixed-reality's Issues

What is DirectionalIndicatorController script for?

The tutorial doesn't explain the reason for using the DirectionalIndicatorController script. The script makes the Indicator inactive, and if you're unlucky like me you won't see the Indicator at all. I thought the solver wasn't working and then found that it was just disabled. Why? Everything works fine without the DirectionalIndicatorController script, so I suggest removing it from the tutorial.
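For context, here is a hypothetical sketch (not the tutorial's actual script) of the pattern a controller like this usually implements: keep the directional indicator hidden while its target is in view and show it only when the target leaves the camera's view, assuming a plain Unity setup.

    using UnityEngine;

    // Hypothetical sketch only -- not the tutorial's DirectionalIndicatorController.
    // Shows the typical pattern: hide the indicator while the target is visible,
    // show it when the target moves outside the camera's view.
    public class DirectionalIndicatorToggle : MonoBehaviour
    {
        [SerializeField] private Transform target;      // object the indicator points at
        [SerializeField] private GameObject indicator;  // arrow/chevron visual

        private void Update()
        {
            Vector3 viewportPos = Camera.main.WorldToViewportPoint(target.position);
            bool targetInView = viewportPos.z > 0f &&
                                viewportPos.x > 0f && viewportPos.x < 1f &&
                                viewportPos.y > 0f && viewportPos.y < 1f;
            indicator.SetActive(!targetInView);
        }
    }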



RoverAssembly Disappears When I Move It

First up, the tutorials are great, really clear and easy to follow. Thank you for being so thorough with them.

I'm up to the point, though, where I'm testing Tap to Place on the HoloLens 2, and every time I tap to grab the RoverAssembly, I can pick it up and move it slightly, but then it just disappears without me having done the second tap. Testing in the Game panel in Unity it works just as expected, with left click to pick up, right click to move my gaze, and left click to place, but in the headset I lose it every time. I'm just in a small room, testing it on the floor in front of me.

I've double-checked my steps and I have everything just as the tutorial says, so I'm a bit stumped. Any known trouble spots for the Tap to Place movement?

Many Thanks!




'Enable Tap to Place' voice command is not working on the device

I set everything up as described in the tutorial, but the last voice command, 'Enable Tap to Place', only works in the Unity Editor and not on the HoloLens 2. I tried keeping the cursor on the Rover, looking at it, etc. The speech confirmation tooltip doesn't pop up! I checked that eye tracking was working fine with the SeeItSayItLabels. Other voice commands that don't require focus work fine. Maybe it's still an issue with the "Is Focus Required" property, as it was on MRTK 2.3?

I was using Unity 2019.4.9f and MRTK 2.4.



Indicator only works once

Could you update the tutorial so the indicator works more than once? When you look away the first time, the indicator is visible; however, when you glance back at the explorer, the indicator becomes invisible and never turns visible again when you glance away again.


"NotImplementedException" from OwnershipHandler.cs

Hi there! From Multi-user capabilities I got a "NotImplementedException" #8311, and I'm filing something here since the issue is related to the tutorial :)

As explained in the MRTK issue, I was wondering about the reason behind the "throw new NotImplementedException()" inside OnInputUp(), OnOwnershipTransfered(), and OnTriggerExit(), as it fires an error.
Is it there for a future update of the series, or is it something that we (as users) have to fill in according to our needs?

The script is located at MRTK.Tutorials.MultiUserCapabilities/Scripts/OwnershipHandler.cs
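For reference, a minimal sketch (not the shipped OwnershipHandler.cs) showing that the placeholder throws can simply be replaced with empty bodies so the handlers stop raising NotImplementedException; the InputEventData, PhotonView, and Player types assume MRTK 2.x and Photon PUN 2 are imported.

    using Microsoft.MixedReality.Toolkit.Input;
    using Photon.Pun;
    using Photon.Realtime;
    using UnityEngine;

    // Hypothetical sketch only. In the real script these methods come from the
    // MRTK input and Photon ownership interfaces; the point is that an empty
    // body is a valid "nothing to do here" implementation.
    public class OwnershipHandlerSketch : MonoBehaviour
    {
        public void OnInputUp(InputEventData eventData)
        {
            // Intentionally empty: no action needed when input is released.
        }

        public void OnOwnershipTransfered(PhotonView targetView, Player previousOwner)
        {
            // Intentionally empty: no extra work needed after ownership changes.
        }

        private void OnTriggerExit(Collider other)
        {
            // Intentionally empty: nothing to do when a collider leaves the trigger.
        }
    }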

Additional context

Again, I might be missing an obvious point here because of my weak background in software development.



Typo and missing screenshot in spatial audio tutorial

This description is missing a screenshot and contains a typo.

image



Recognizer remains active across scenes.

Good day. I have made an implementation of this for my game and found that the script remains active even when changing levels. Is there any way to lock the recognizer to the scene?
LightBuzz/Speech-Recognition-Unity#4
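One possible approach, sketched below under the assumption that the recognizer is Unity's built-in KeywordRecognizer (this is not taken from the linked project), is to stop and dispose the recognizer when its hosting object is destroyed during a scene change, so it cannot outlive the scene.

    using UnityEngine;
    using UnityEngine.Windows.Speech;

    // Minimal sketch: a keyword recognizer scoped to a single scene.
    public class SceneScopedRecognizer : MonoBehaviour
    {
        private KeywordRecognizer recognizer;

        private void Start()
        {
            recognizer = new KeywordRecognizer(new[] { "start", "stop" });
            recognizer.OnPhraseRecognized += OnPhraseRecognized;
            recognizer.Start();
        }

        private void OnPhraseRecognized(PhraseRecognizedEventArgs args)
        {
            Debug.Log("Heard: " + args.text);
        }

        private void OnDestroy()
        {
            // Runs when the scene unloads, so the recognizer is torn down with it.
            if (recognizer != null)
            {
                recognizer.OnPhraseRecognized -= OnPhraseRecognized;
                if (recognizer.IsRunning) { recognizer.Stop(); }
                recognizer.Dispose();
            }
        }
    }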



provided link to the paper isn't working

The link at line 314 of the page https://docs.microsoft.com/en-us/windows/mixed-reality/mixed-reality, the anchor tag <a href="https://etclab.mie.utoronto.ca/people/paul_dir/IEICE94/ieice.html" data-linktype="external">A Taxonomy of Mixed Reality Visual Displays</a>, points to https://etclab.mie.utoronto.ca/people/paul_dir/IEICE94/ieice.html, which isn't functioning anymore.



Possible Typo

The instructions on this page direct the student to add a second On Placing Started () event. However, the associated screenshot shows the creation of an On Placing Stopped () event. Which is it?



Missing Capability Check

Hi! I followed these instructions and they were great, but sadly they did not work because of missing capabilities, which resulted in several hours of debugging. Users following these instructions should know that they need to set

  1. Internet (Client and Server)
  2. Private Networks (Client and Server)

Another issue is that when I set these in Unity under Player settings, they didn't transfer to the appxmanifest in VS, which was very confusing. They have to be set directly in the built project's Package.appxmanifest :)
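For reference, those two capabilities correspond to entries like the following in the built project's Package.appxmanifest (a sketch of just the relevant fragment; element names are the standard UWP manifest ones):

    <Capabilities>
      <Capability Name="internetClientServer" />
      <Capability Name="privateNetworkClientServer" />
    </Capabilities>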



Deploying to HoloLens 2 might need different settings (when deploying over Wi-Fi)

"Configure Visual Studio for HoloLens by selecting the Master or Release configuration, the ARM64 architecture, and Device as target" <- this only works when your Hololens is connected to your PC via USB AND you are also able to access it via the device portal by opening https://127.0.0.1:10080 in your browser.
However, this (for some reason) often does not work (I, for example currently get the following error: SSL_ERROR_RX_RECORD_TOO_LONG in firefox).

If your Hololens 2 is paired to your PC AND you can access it via the device portal by opening its real network address in your browser, then you can deploy over wifi, but with different settings. Hence, I suggest adding the following text tot he tutorial:

"To deploy over wifi, configure Visual Studio for HoloLens by selecting the Master or Release configuration, the ARM64 architecture, and Remote Machine as target. Then go to Project -> Properties -> Configuration Properties -> Debugging. Make sure Configuration is set to Release and Platform to ARM64. Then, in the dropdown Debugger to launch: select Remote Machine and add the real network address of your Hololens as Machine Name.

I also suggest adding an image like this one (feel free to use this).



Spatial Audio tutorial missing explanations

This tutorial states that the user will learn how to use HRTF offload, but there's no description of what HRTF offload is or why you would choose it.

There is also no explanation of what it means to "spatialize audio" and why you would want this.


Missing React-Native development frameworks

The document can include http://components.magicscript.org/, which is a React Native-based framework that runs on ARKit and ARCore.


Incorrect Prerequisite for Azure Spatial Anchors tutorial

One of the prerequisites for the Azure Spatial Anchors tutorial is having already completed that same tutorial series:
"Completed the Azure Spatial Anchors tutorials series or previous experience creating an Azure Spatial Anchors Account"



"Unity Sample Page" links to deprecated git repo

When you click the link to "Unity Sample Page" towards the bottom of the page it takes you to

https://github.com/sceneunderstanding-microsoft/unitysample

Which says the repo has moved to https://github.com/microsoft/MixedReality-SceneUnderstanding-Samples

The link in the docs should be updated to the new repo.



Failed to connect to Azure storage on HoloLens 2 device

Good afternoon,

I was following the Azure storage tutorial and was able to run the app in the Unity Editor just fine, but when I deployed the app and attempted to set an object I was met with this error:
'Failed to connect with Azure Storage. If you are running with the default storage emulator configuration, please make sure you have started the storage emulator.'

Is there a solution or work around for this?

Thank you,
Nick



MRTK profile tutorial missing explanations

In the Overview this tutorial says that,

This particular example will show you how to hide the spatial awareness mesh by changing the settings of the Spatial Mesh Observer.

Nowhere in the tutorial does it explain what a spatial awareness mesh is or why its settings should be changed. There is also no recommendation on which setting is best practice for future projects.

This tutorial is also missing general explanations on what a profile is and how it should be used. For someone new to Mixed Reality and MRTK, this can be confusing.



Issue with Photon Unity Networking after following Multi-user capabilities tutorial

I have followed the guide in the documentation here:
https://docs.microsoft.com/en-us/windows/mixed-reality/develop/unity/tutorials/mr-learning-sharing-03
multiple times, making sure I follow the instructions exactly, but no matter what I do, there are always ~114 errors relating to Photon when I try to build the Unity project.
The errors are basically variations of the following:

    Assets\Photon\PhotonChat\Code\ChatAppSettings.cs(17,11): error CS0246: The type or namespace name 'ExitGames' could not be found (are you missing a using directive or an assembly reference?)
    Assets\Photon\PhotonChat\Code\ChatClient.cs(1194,46): error CS0246: The type or namespace name 'EventData' could not be found (are you missing a using directive or an assembly reference?)
    Assets\Photon\PhotonChat\Code\ChatClient.cs(941,14): error CS0538: 'IPhotonPeerListener' in explicit interface declaration is not an interface
    Assets\Photon\PhotonRealtime\Code\ConnectionHandler.cs(20,26): error CS0246: The type or namespace name 'ExitGames' could not be found (are you missing a using directive or an assembly reference?)
    UnityEditor.BuildPlayerWindow+BuildMethodException: 114 errors
      at UnityEditor.BuildPlayerWindow+DefaultBuildMethods.BuildPlayer (UnityEditor.BuildPlayerOptions options) [0x002bb] in <8004fcc221b54f98ba547350ea71d812>:0
      at UnityEditor.BuildPlayerWindow.CallBuildMethods (System.Boolean askForBuildLocation, UnityEditor.BuildOptions defaultBuildOptions) [0x00080] in <8004fcc221b54f98ba547350ea71d812>:0
    UnityEngine.GUIUtility:ProcessEvent(Int32, IntPtr)

I've tried in both 2019.4.5f1 & 2019.4.10f1.
I've successfully exported many Unity projects to the HoloLens 2 I'm using, but cannot get the multi-user experience to work.

Sorry if this is not enough info to help, feel free to request any other info.
Thanks in advance.

Warnings importing Azure Spatial Anchors package

Unity version 2019.4.10f1

This tutorial should call out that there are warnings related to legacy XR/XR SDK when importing this package and what to do about them. The warning instructs the user to update to the new Unity XR plugin system, so this tutorial should instruct the user on how to proceed.

image



Incomplete set of latest voice commands

It looks like the dev docs haven't been updated with the latest set of system voice commands from the 20H1 release documented here: https://docs.microsoft.com/en-us/hololens/hololens-cortana

This could cause issues for developers because the system commands are reserved and apps can't use them.



Expectations for testing spatialized audio not clear

There's no description of what to expect when testing the spatialized button interaction sounds. The user cannot tell if this is working correctly or not.



this documentation is bad.


So frankly, this sucks.

You have strung along 30 different code strings, tiny applications, and a crap ton of documentation for people to sift through to make their $3000 AR headset work. This culture of GitHub sucks, and frankly, as a business professional, the last thing I want to do is sit on a forum. Make this work with one package. These instructions are very vague and require me to have 70 different tabs open, all cross-referencing each other. Get your act together, MRTK. You have had numerous revisions and a few years to work it out. Get it done and stop making us chase across the internet to make something work.



"Creating User Interfaces" tutorial needs additional explanations

This tutorial instructs on how to configure components but needs more explanation on what the components do. This tutorial is geared towards people getting started so it should explain the "why" behind the "what" every step of the way.

Interactable
The Interactable component is missing an explanation. An example of what could be included:
The Interactable component is an all-in-one container that makes any object easily interactable and responsive to input. Interactable acts as a catch-all for all types of input, including touch, hand rays, speech, etc., and funnels these interactions into events and visual theme responses.

OnClick()
Needs further explanation on how this works with the input system (controllers, hands, focus, pointers).
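One possible addition (a minimal sketch, assuming MRTK 2.x's Interactable in Microsoft.MixedReality.Toolkit.UI): OnClick can also be wired up from code, which helps illustrate that the different input types (touch, hand rays, gaze plus air tap, speech) all funnel into the same click event.

    using Microsoft.MixedReality.Toolkit.UI;
    using UnityEngine;

    // Minimal sketch: subscribe to the Interactable's OnClick event from a script
    // attached to the same GameObject, mirroring the Inspector-based setup.
    public class ClickLogger : MonoBehaviour
    {
        private void Start()
        {
            var interactable = GetComponent<Interactable>();
            interactable.OnClick.AddListener(() => Debug.Log("Button clicked"));
        }
    }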



XR Legacy Input Helpers 2.1.4 instead of Legacy Input Helpers 2.1.4

Dear MicrosoftDocs,
at the beginning of the page you ask to install "Legacy Input Helpers 2.1.4", but I can only see one called "XR Legacy Input Helpers 2.1.4" listed.
I guess this is the correct name.
Thanks for the attention.
Regards
Ennio



Failed to connect with Azure Storage.

Hi,

When I tried to run the example on this page under 'Writing and reading data from Azure Table storage', I could run the app in Unity with the connection string (Azure storage can update the objects). However, when I deployed the app to HoloLens 2, there was an error: 'Failed to connect with Azure Storage. If you are running with the default storage emulator configuration, please make sure you have started the storage emulator.' Could you explain the reason? Thank you!



says nothing about running the app.

So, this document outlines NOTHING about how to run the app you just produced. It does not say that the app pops up in front of you, or that you have to open it, or anything like that.



Missing information about Source Code Editor could lead to Error while packaging.

Hey!

Keep in mind that you have to select Visual Studio 2019 as your Source Code Editor if you already have UE installed with another editor.

Otherwise there will be an error while packaging!



AMAZING TUTORIAL

I missed something in the tutorial but then found it on re-reading, so I closed the issue. Amazing tutorial.



The SampleQRCodes Unity project doesn't work on HoloLens 2

With newer Unity versions (2020.1.6f1), when building the SampleQRCodes Unity project and deploying it to HoloLens 2, no QR codes are tracked and the Axis object just stays stationary in front of the wearer's eyes.



Project config/legacy XR settings missing explanations

Project Configuration Settings are not Explained

This tutorial is geared towards beginners yet does not explain the configuration settings. The user is told to simply check/select settings but is not told why, or what these settings do with respect to the project or the HoloLens 2 device. These config settings include:

  • MS HRTF Spatializer
  • Depth Format
  • Spatial Awareness Layer
  • Single Pass instanced render path
  • Legacy XR API

Each of them needs an explanation as they will all leave beginners confused as to what they are doing.

No message about Built-in XR deprecation

Warnings are likely to confuse users who are new to Unity/MR development. The Unity editor provides a warning about built-in XR being deprecated in a future version of Unity. This should be addressed in the tutorial, either via a link explaining what's happening next (i.e., XR SDK) or through a short summary.

image



Links to next page in tutorial series should be a button

Instead of using text links to continue to the next tutorial in a series, we should use big buttons that grab the attention of the user.

Here is an example from the Microsoft Docs Blob storage tutorial
image



Documentation inconsistent with HL1 emulator requirements

The installer of the HL1 emulator currently requires VS2017 to successfully complete if you do a complete install or want to install the templates. The documentation does not mention this and only requires VS2019.
Please either fix the installer or update the documentation to clarify that to install the HL1 emulator with Visual Studio 2019 you need to deselect the VS templates and manually add them later on, or you need VS 2017 installed.



Click eye for Rover Assembly not Rover Explorer

Your document says:
To make it easier to work with your scene, in the Hierarchy window, click the eye icon to the left of the object to toggle the scene visibility for the RoverExplorer object off. This hides the object in the Scene window without changing its in-game visibility:

This makes everything invisible even the parts that you are manipulating. It should say:
To make it easier to work with your scene, in the Hierarchy window, click the eye icon to the left of the object to toggle the scene visibility for the RoverAssembly object off. This hides the object in the Scene window without changing its in-game visibility:

Thank you.


Hololens 2 Getting Started Introduction page has a link to itself

This page lists links for the tutorial series that the user is about to begin. The list does not clearly indicate where the user is on the journey and includes a link to the page the user is already on.

image



In my HoloLens 2, the Developer Mode option is grayed out and I am not able to turn it on. Will need more clarification.

So I received a HoloLens 2 from a friend and noticed the Developer Mode option is grayed out and I am unable to turn it on. I stumbled upon this documentation, but unfortunately there isn't useful information available for that case. We will need to add appropriate steps to solve the issue.

Best,
Samvid



The examples do not work as described.

The documentation is out of date, wrong, or incomplete. I cannot make the simplest action work as described in this document. This is true of practically all the Unity and MRTK documentation. Go ahead as usual and close the ticket because not enough information is provided; I cannot explain more. Get a new computer, install all the recommended software, follow the instructions, and when you get a working example, send it to me. Don't check any box or make any adjustment that is not described in the documentation.



Wrong button blueprint used

Under "Adding a button" section the instructions say to use BP_SimpleButton. You should actually use BP_ButtonHololens2, like the screenshot shows.



LUIS app creation UI has changed, instructions do not match UI

The luis.ai webpage UI has changed, and there are now options that don't match what the user is instructed to do.

Old UI for app creation
image

New UI for app creation
image

Old UI for entity creation. The tutorial instructs the user to select a Simple Entity but this option is not available.
image

New UI for entity creation
image



Incorrect button blueprint class listed?

Under Adding a button, step 1 says to add a blueprint class of type BP_SimpleButton, but the image below shows a BP_ButtonHoloLens2 class being selected. The latter appears to be the correct class, based on the settings that need to be changed (i.e., the icon).



QR codes with Logos documentation

Not sure if this is sufficient information to add to the docs, but the current documentation states "QR codes with logos have not been tested and are currently unsupported."

I have tested this feature with partially obscured QR codes, i.e., a normal code with a logo, another image, or a general obstruction placed over the center. The feature was tested with the sample project from @chgatla-microsoft at https://github.com/chgatla-microsoft/QRTracking/tree/master/SampleQRCodes

This works as expected. The QR code is detected and oriented correctly as long as the code is correctly generated with an error correction level that supports the amount of obscurity: https://en.wikipedia.org/wiki/QR_code#Error_correction. The caveat is the usual one for error-corrected QR codes: the code becomes more complex (more bits are included) and may be more difficult to detect than a non-obscured, non-error-tolerant code.



Broken link to examples.

Broken link: https://github.com/microsoft/MixedRealityToolkit-Unity/blob/mrtk_development/Assets/MixedRealityToolkit.Examples/Demos/Solvers/Scenes/SurfaceMagnetismSpatialAwarenessExample.unity



Tutorial does not describe what to expect after deploying to Hololens

At the "Build and deploy the application" stage of the tutorial, the user has created an empty app and is now deploying that app to the Hololens. Since the tutorial does not describe what the user should see in their Hololens device once the app is deployed, it is very difficult for the user to confirm that the app is working correctly on device.

Fix: There should be a description of what the user should expect from an empty app deployed on a Hololens shown at the beginning of this section.

In addition, the "Build and deploy the application" section has enough content to be its own module in the tutorial series. App deployment is referred to several times in future tutorial modules so it would make sense for it to exist as its own module.



Example code in c++

Could we get an example of how to scan a QR code and get the info from the code in C#?

Thanks
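In the meantime, here is a minimal C# sketch (the body of the issue asks for C#) assuming the Microsoft.MixedReality.QR package; the type and member names reflect my understanding of that API and should be treated as a starting point rather than verified sample code.

    using Microsoft.MixedReality.QR;
    using UnityEngine;

    // Minimal sketch: request access, start a QRCodeWatcher, and log the decoded
    // payload of each QR code as it is detected.
    public class QRCodeReader : MonoBehaviour
    {
        private QRCodeWatcher watcher;

        private async void Start()
        {
            var status = await QRCodeWatcher.RequestAccessAsync();
            if (status != QRCodeWatcherAccessStatus.Allowed)
            {
                Debug.LogWarning("QR code access was not granted.");
                return;
            }

            watcher = new QRCodeWatcher();
            watcher.Added += (sender, args) => Debug.Log("QR code detected: " + args.Code.Data);
            watcher.Start();
        }

        private void OnDestroy()
        {
            watcher?.Stop();
        }
    }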



MSBuild for Unity Support

MSBuild for Unity Support is not visible on the user's screen in Step 2, but it is an option we see in Unity today in the MRTK config settings. Can you please add this so folks have certainty about project initialization? The blurb in the settings pane calls this out as enabling additional HL2 features such as hand joint remoting and depth LSR mode. @martinwilter



Link for MRTK Getting Started Guide not working.

Hey! The link for "MRTK getting started guide" is leading to a 404.

https://microsoft.github.io/MixedRealityToolkit-Unity/Documentation/GettingStartedWithTheMRTK.html



HoloLens 2 UE4 Remoting issues

I'm currently remoting a UE4 application with the HoloLens 2 using the built-in UE4 connection with Holographic Remoting. I have a router that I'm using so that my laptop and HoloLens are on the same network, and I start the remoting about 5 feet from the router itself. I notice that if I walk too far away from the router, my quality drops both visually and in terms of the line-casts coming from my hands. Most notably, my hand line-casts will jump around, leading me to believe they aren't interpolating correctly due to a poor connection to the laptop. Because of the lines jumping, it becomes impossible to air-tap select objects.

I've tried turning up the "Max network transfer rate" in UE4 project settings to no avail. Anyone have any tips? Thanks!

Incorrect function in Tap To Place script instructions

When setting up the Tap To Place button, the instructions guide you through modifying the Tap To Place Script attached to the RoverAssembly. The instructions lead you to set up the wrong function. The image that accompanies this text is correct.

The instructions should say "On Placing Stopped () event"; otherwise, the Tap To Place button will not work as described.

image
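For reference, a hedged sketch of wiring that event from code, assuming MRTK 2.4's TapToPlace component exposes OnPlacingStopped as a UnityEvent; this mirrors the Inspector step the tutorial describes.

    using Microsoft.MixedReality.Toolkit.Utilities.Solvers;
    using UnityEngine;

    // Minimal sketch: react when the user finishes placing the object that has
    // a TapToPlace component attached.
    public class RoverPlacementEvents : MonoBehaviour
    {
        private void Start()
        {
            var tapToPlace = GetComponent<TapToPlace>();
            tapToPlace.OnPlacingStopped.AddListener(() => Debug.Log("Rover placed"));
        }
    }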



Page links to self

Page links to itself. https://docs.microsoft.com/en-us/windows/mixed-reality/surface-magnetism



Outdated images/info

We should take more info and images from this doc, or just retire this page.


