immersive-web / webxr-samples
Samples to demonstrate use of the WebXR Device API
Home Page: https://immersive-web.github.io/webxr-samples/
License: MIT License
Some of the examples (e.g. reduced-bind-rendering.html) are fine to use on an AR device. Could the code be updated so it checks for immersive-ar?
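A minimal sketch of the requested check, assuming the standard navigator.xr.isSessionSupported() API (the helper name pickSessionMode is hypothetical, not part of the samples):

```javascript
// Hypothetical helper: choose the best-supported session mode before
// calling requestSession(). `xr` is expected to be navigator.xr-shaped.
async function pickSessionMode(xr) {
  if (!xr) return null; // WebXR not available at all
  if (await xr.isSessionSupported('immersive-ar')) return 'immersive-ar';
  if (await xr.isSessionSupported('immersive-vr')) return 'immersive-vr';
  return 'inline'; // 'inline' is always supported per the WebXR spec
}
```

A sample could then call navigator.xr.requestSession(mode) with the returned mode, or gray out the button when null.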
It may save a few headaches to mention, either in the README or on the pages themselves, the secure context requirement for WebXR (needing either https or localhost/127.0.0.1) and how to go about setting that up. Via #47
This year the WebXR API in Chrome has been going through a lot of changes and that is fine. However, I am starting to wonder if I have something incorrectly setup. I have been unable to get WebXR going on my desktop HMDs since Chrome 74.
We just get an exception when 'requestSession' is called with 'immersive-vr'. Note: I am only interested in desktop VR, not mobile, which seems to be more common.
navigator.xr.requestSession('immersive-vr').then(onSessionStarted);
Version 77.0.3865.90 (Official Build) (64-bit)
All I need is a sample that works on a current build of Chrome with Oculus Rift or Vive.
Is it possible I have flags not configured correctly? I have tried enabling/disabling everything and various combinations, and to no avail. Always the Uncaught DOMException before any client code is called.
My current solution is to downgrade to Chrome 74, but there must be a reason nothing is working.
Please, @toji, any tips would save me right now.
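One likely cause worth hedging on: in newer Chrome builds, immersive sessions generally must be requested from a user gesture, so calling requestSession('immersive-vr') at page load or from the devtools console is expected to reject. A sketch of wiring the request to a click and surfacing the rejection (wireEnterVr and the callback names are illustrative):

```javascript
// Sketch: request the immersive session from inside a user-gesture
// handler and surface any rejection instead of leaving it uncaught.
function wireEnterVr(button, xr, onSessionStarted, onError) {
  button.addEventListener('click', () => {
    xr.requestSession('immersive-vr')
      .then(onSessionStarted)
      .catch(onError); // e.g. log err.name / err.message for diagnosis
  });
}
```

Logging the DOMException's name and message in onError would at least distinguish a security/activation failure from a missing runtime.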
The explainer's rAF callback section says:
(Timestamp is given for compatibility with window.requestAnimationFrame(). Reserved for future use and will be 0 until that time.)
Why is that, and how are applications supposed to get an accurate target timestamp for animations that matches the expected time when the frame is going to be displayed? That's important for smooth animations.
Is there a field in the frame data that's supposed to be used for this?
The CubeSea sample uses wallclock time from performance.now() which seems like a hack, and is also inaccurate. You don't base animations on the current time when your rendering starts. If your render loop starts late but still finishes in time for the next VSync equivalent, you're supposed to use that VSync's time, not the late start time. See also #13 .
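A sketch of timestamp-driven animation under that model: deriving state purely from the rAF-provided time means a late-starting frame still computes the pose for the upcoming display time (spinAngle and the 90°/s rate are illustrative, not from the samples):

```javascript
// Sketch: compute animation state from the rAF timestamp, not
// performance.now(), so work that starts late still targets the
// frame's predicted display time.
function spinAngle(rafTimestampMs, degreesPerSecond) {
  return (rafTimestampMs / 1000) * degreesPerSecond % 360;
}

// Inside a session's rAF callback, `time` would be the UA-supplied
// timestamp for the frame being rendered:
//   session.requestAnimationFrame((time, frame) => {
//     cube.rotation = spinAngle(time, 90);
//   });
```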
Hello,
I have been trying to modify your code to play a 360-degree video but my effort has not been successful so far.
Could you provide any guideline to play the 360-degree video?
Trying to use any of the samples gives me a faded out "VR NOT FOUND" button in Chrome 83, Windows 10 with a Rift connected. I am in fact viewing the web page via Oculus Dash, so the headset is definitely working. I found some documentation indicating I needed to disable the XR sandbox flag and enable the Oculus flag in Chrome, but neither flag exists in Chrome it seems any longer, nor the latest version of Chromium.
Currently I can't find the plane detection sample on the WebXR proposals page (https://immersive-web.github.io/webxr-samples/proposals/), but I can test it using https://storage.googleapis.com/chromium-webxr-test/r824623/proposals/phone-ar-plane-detection-anchors.html.
Has plane detection been removed from the WebXR proposals page? Is the plane detection API deprecated?
Thanks!
Win 10, Chrome ver 87, Oculus Rift S. inputSource.gamepad always reports 0 axes for both left and right controller. Buttons are working fine.
It would be nice to have an example that shows how different light sources could be implemented.
I am aware that the lighting concepts from WebGL apply here, but I'm struggling to find information on how to work with the light.
As there is only one light pointing in one direction (down) in the different samples, I find it very hard to learn how to add additional light sources just from looking at the rendering files. The only thing I have figured out is how to manipulate the existing light.
Why, when I click Enter VR, does WMR open but without the VR sample?
Am I missing something? I'm new to VR dev and still learning, sorry if this is a noob question!
I'm trying to run this with a Samsung Odyssey+.
The debugging and app session work fine, but I can't see the sample in the VR headset.
Any help or tips are welcome.
The hit-test example doesn't work for me on Chrome Canary version 83.0.4086.1
on Android 10 Samsung Galaxy S20 Ultra.
Is it still too early or should I toggle some flag?
Hi,
I work for a university in an innovation lab. We look into the latest technologies, especially now focusing on AR.
I was fascinated by the Statue Demo that was demonstrated. I am relatively new to the JS side of things, as I have a 3D animation and modelling background. I have played with your library and have to say it is fascinating. I was most curious about the annotation that was placed on the object. I am very keen to know if that particular demo is available for download from any git source. It would be really helpful for the project we are building for the university, which involves the study of anatomy. We have been racking our brains trying to develop something similar, with bad results.
I would be much obliged if you could give me some hint or direction on how to implement this.
Thank you
Hi,
I'm using a Moto G6 Plus with ARCore and Chrome Canary 69.0.3476.0 (today's nightly) to try out the proposed handset examples here:
https://immersive-web.github.io/webxr-samples/proposals/
If I go directly to this url, both samples work faultlessly and actually the experience is pretty nice.
However, if I check out the repo and start a local web server (Node http-server) with the root webxr-samples/, then I visit the local url (ip_addr:8080/webxr-samples/proposals/phone-ar.html) with the same handset it says "AR NOT FOUND" on the button and I can't start either AR experience.
I wondered if there were any dependencies missing from the repo but unfortunately it is hard to spider the pages because links are created in javascript and wget won't follow those.
Apologies if I have missed any documentation, but I haven't been able to find anything relevant. Is there something else I need to do to be able to get the samples to run from a local web server?
Thanks
according to https://github.com/immersive-web/webxr-samples/blob/master/js/render/core/material.js
MaterialUniform has a getter and a setter.
Looking at input-selection.html
as an example in onSqueeze()
uniforms.baseColorFactor.value = [Math.random(), Math.random(), Math.random(), 1.0];
console.log("random colour:");
console.log(uniforms.baseColorFactor.value);
uniforms.baseColorFactor.value is undefined.
What is the correct getter method for baseColorFactor?
I've found _value, but that seems like it should be hidden behind a getter method.
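For reference, a minimal sketch of the accessor pattern material.js appears to follow (MaterialUniformStub is a stand-in, not the sample's actual class). If .value comes back undefined while ._value is populated, the value getter is likely missing or defined on a different prototype than the object actually being read:

```javascript
// Stand-in illustrating the getter/setter pattern: _value is the
// backing field, and reads/writes are meant to go through `value`.
class MaterialUniformStub {
  constructor(value) {
    this._value = value; // conventionally "private" backing store
  }
  get value() {
    return this._value;
  }
  set value(v) {
    this._value = v;
  }
}
```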
I downloaded the GitHub code, unzipped it, and tried to run the samples. They don't show "Enter VR". However, when I run the code in the same browser directly from github.io, it works. When I download the zip file from GitHub, do I have to run anything to be able to see "Enter VR"? I am using the same browser; one copy of the code is local and one is served directly from GitHub.
Thanks!
The hit test in the phone-ar-hit-test.html proposal seems to be broken. Following Error occurs:
Uncaught (in promise) TypeError: Failed to execute 'requestHitTest' on 'XRSession': parameter 1 is not of type 'XRRay'.
at XRSession.onXRFrame (phone-ar-hit-test.html:214)
Would love to help update the link to example 17 but can't seem to find it anywhere? Has this example been removed?
One of the great advantages of AR is that one is able to extract information from the background in order to perform Computer Vision operations such as feature detection, object tracking, markers etc.
So far, I've been able to identify 3 possible ways to accomplish this, yet none seem to present itself as the way to go:
onXRFrame(time, frame) {
let session = frame.session;
...
session.renderState.baseLayer.context.readPixels(...)
...
}
This yields the correct data, which is the full background imagery without any superimposed 3D models, although it cripples performance due to its blocking nature.
c = document.createElement('canvas')
ctx = c.getContext('2d');
...
onXRFrame(time, frame) {
let session = frame.session;
...
ctx.drawImage(session.renderState.baseLayer.context.canvas, 0, 0)
...
}
Performance is relatively better than in option 1, although calling ctx.getImageData(...) directly afterwards yields an array of only zeroes. This is usually because the preserveDrawingBuffer option is left at its default of false; however, I have verified this was not the case in every attempt I made to get this to work. Also, the fact that readPixels() works while this does not is quite baffling to me, as they should be reading from the same source, no?
...
let offscreenCanvas = new OffscreenCanvas(...);
let gl = offscreenCanvas.getContext( 'webgl', { xrCompatible: true });
...
session.updateRenderState({ baseLayer: new XRWebGLLayer(session, gl) });
...
onXRFrame(time, frame) {
let session = frame.session;
...
session.renderState.baseLayer.context.canvas.transferToImageBitmap();
...
}
Haven't thoroughly tested this setup for performance, yet my first impression is that it definitely beats option 1. Unfortunately it shares the same issue as option 2: The image is completely blank, no data seems to have been transferred with it.
Given that my goal should not be completely alien to the application of AR, I was wondering if and how I am supposed to retrieve the camera imagery captured in AR mode (in my case legacy-inline-ar) in a performant manner. Am I looking in the wrong place? Am I missing something? Given the multitude of samples in this repo, I was hoping to get some insight from folks who are more experienced with this API.
Would you accept a pull request which alerts the user in js/util/webxr-button.js if their device is unsupported? Currently the button just shakes, and to see the error you need to plug your phone in and visit chrome://inspect (or equivalent). It would be handy to tell the user what's going on in a modal or alert.
https://github.com/immersive-web/webxr-samples/blob/master/js/util/webxr-button.js#L483
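A hedged sketch of what such a change could look like; the helper name and message format are assumptions, not the button's actual API:

```javascript
// Sketch: turn the rejection into a visible message instead of only
// shaking the button; `notify` could be window.alert or a modal.
function reportXRFailure(err, notify) {
  const message = err && err.message
    ? 'WebXR unavailable: ' + err.message
    : 'WebXR is unavailable on this device or browser.';
  notify(message);
  return message;
}
```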
It seems like sometime within the last month, the inline-session example stopped supporting Magic Window mode, and the version without the polyfill seems to have stopped working entirely. The immersive-VR mode works as expected.
When attempting to view the sample without the polyfill, the debug console registers an issue about xr.isSessionSupported not being a function.
We have an app which makes use of WebXR, and in my investigation there, it seems as of recently the local reference space is no longer supported by the Pixel 2 (or we're now requesting it incorrectly), which may or may not be related.
Apologies for the rather messy issue description, but I've been having a bit of a time the last short while trying to figure out how to get Magic Window working in our app again.
Behavior:
What are the requirements for these samples?
Currently it only displays the external prop, which is also going to be removed :)
webxr-samples/xr-device-enumeration.html
Line 91 in a4eeb96
Please see https://immersiveweb.dev/chrome-support.html
Originally posted by @klausw in immersive-web/webxr#964 (comment)
My Chrome version is 79.0.3945.79 (armeabi-v7a, com.android.chrome_79.0.3945.79-394507905_minAPI19_maxAPI23 from apkmirror.com), my Android version is 9.0, and the phone is a OnePlus 7 (which supports ARCore). I have enabled all XR items in chrome://flags, and the homepage of this link (https://immersive-web.github.io/webxr-samples/) also shows that my browser supports WebXR; everything looks fine. But when I click Immersive AR Session to enter the page and then click Start AR, the console reports an error: "XRSession creation failed: The specified session configuration is not supported."
Why? Please give guidance, thank you.
Hi,
On Chrome Canary (82.0.43083.0), Android 9, LG-G810.
The Barebones WebXR DOM Overlay doesn't work for me, I am not sure what I am supposed to be seeing from the description. I can press the enter AR button but then the browser locks as if something is rendered on top of it. I see nothing happening further, no square or camera.
I also allowed Augmented Reality in the site settings, clear/reset didn't help.
Also, on a sidenote, are there any ground plane examples? Or is this not possible right now?
First of all, thanks a lot for the sample applications.
I'm trying to incorporate THREE.js as the render engine. I tried this based on this proposal.
As stated in the WebXR explainer, I have to set the session's baseLayer via the session's updateRenderState method.
Now when I try to set the session's baseLayer, it remains null. There is no error thrown which would indicate that session.updateRenderState didn't work.
Here is the code which is executed when an immersive-ar session is successfully started:
onSessionStarted = async session => {
session.addEventListener('end', this.onSessionEnded);
this.THREEjsRenderer = new THREE.WebGLRenderer({
...
});
this.THREEjsRenderer.autoClear = false;
this.THREEjsRenderer.shadowMap.enabled = true;
this.THREEjsGL = this.THREEjsRenderer.getContext();
await this.THREEjsGL.makeXRCompatible();
this.xrRefSpace = await session.requestReferenceSpace('local');
session.updateRenderState({ baseLayer: new XRWebGLLayer(session, this.THREEjsGL) });
const {framebuffer} = session.renderState.baseLayer; // <--- XRSession creation failed: Cannot read property 'framebuffer' of null
this.THREEjsCamera = new THREE.PerspectiveCamera();
this.THREEjsCamera.matrixAutoUpdate = false;
session.requestAnimationFrame(this.onXRFrame);
};
Am I missing something important? Or might it be browser-specific? I'm using Chrome Canary 79.0.3939.0.
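One likely explanation, hedged: per the WebXR spec, updateRenderState() queues changes that are only applied before the next animation frame, so reading session.renderState.baseLayer synchronously right after the call can still return the previous value (null here). Deferring the framebuffer read into the rAF callback avoids that; a sketch (startFrames is an illustrative name):

```javascript
// Sketch: updateRenderState() changes take effect before the next
// animation frame, so read renderState.baseLayer inside the callback
// rather than immediately after the call.
function startFrames(session, layer, onFrame) {
  session.updateRenderState({ baseLayer: layer });
  session.requestAnimationFrame((time, frame) => {
    // By the first callback, the pending render state has been applied.
    const { framebuffer } = session.renderState.baseLayer;
    onFrame(framebuffer, time, frame);
  });
}
```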
scene.js does its own timing logic based on performance.now() timestamps.
If I understand it right, it's recommended for applications to use the timestamp supplied to their rAF callback, since this value can be tuned to match the expected time-on-screen for the computed frame with the goal of providing smooth animations. Using a wallclock time bypasses these adjustments, and also isn't helpful for the purpose of testing WebXR implementations.
(I ran into this when I accidentally broke the rAF callback timestamp, making it permanently zero, but initially didn't notice since the CubeSea demo continued animating as expected.)
(Chrome Mobile 88 on a OnePlus 6T; same result on my girlfriend's OnePlus 5, and the same result before updating Chrome and Android)
As the title says, the AR feature seems broken. It seems to work for about a second, then all 3d content disappears. However, if I click the 'run without polyfill', it all works very well and stays there.
referring to:
https://immersive-web.github.io/webxr-samples/immersive-ar-session.html
vs
https://immersive-web.github.io/webxr-samples/immersive-ar-session.html?usePolyfill=0
Tested on Firefox + Oculus Quest using Virtual Desktop.
VR mode works fine on start, but when clicking on "Enable spectator mode" the view on the headset is stuck, while on the desktop the inline view works as it should (including the headset model that indicates the real headset pose).
Dear Team,
In order to take my first steps into WebXR I decided to copy the code onto a plain vanilla NGINX server running on my Windows 10 host.
I'm able to reach index.html and all the other files using http://MyIP/webxr-samples/, but it isn't recognized by Firefox on my Quest.
I don't have any issue with the path https://immersive-web.github.io/webxr-samples/
I hope it is a docker/nginx configuration issue.
Any suggestions or help?
Thanks, Fabio
Samsung Galaxy Note 9 (SM-N960F)
OS: Android 8.1.0
Chrome: 71.0.3578.99
Chrome Canary: 73.0.3672.0
Chrome Dev: 73.0.3667.2
Steps to reproduce the problem:
What is the expected behavior?
AR mode should start with live camera view in background.
What went wrong?
AR mode does not start. Background doesn't show live camera view.
Console:
Chrome Dev and Chrome Canary
Uncaught (in promise) DOMException: Failed to construct 'XRWebGLLayer': This context is not marked as XR compatible. at onSessionStarted (https://immersive-web.github.io/webxr-samples/proposals/phone-ar.html?allowPolyfill=1:150:29) at https://immersive-web.github.io/webxr-samples/proposals/phone-ar.html?allowPolyfill=1:133:15
Chrome
phone-ar.html?allowPolyfill=1:1 Uncaught (in promise) DOMException
Note:
When running on dev server my own AR page without polyfill I get:
Chrome Dev and Chrome Canary
Uncaught (in promise) TypeError: this.gl.setCompatibleXRDevice is not a function
this.gl = this.renderer.getContext();
await this.gl.setCompatibleXRDevice(this.session.device);
this.session.baseLayer = new XRWebGLLayer(this.session, this.gl);
Steps to reproduce the problem:
What is the expected behavior?
VR content should be displayed in the background. The user can select an object and change its color.
What went wrong?
VR content isn't displayed in background.
Console:
Chrome Dev and Chrome Canary
Uncaught (in promise) DOMException: Failed to construct 'XRWebGLLayer': This context is not marked as XR compatible. at onSessionStarted (https://immersive-web.github.io/webxr-samples/input-selection.html?allowPolyfill=1:190:29) at https://immersive-web.github.io/webxr-samples/input-selection.html?allowPolyfill=1:112:19
Chrome
It works! VR content is displayed in the background. The user can select an object and change its color.
I'm fairly new to web development and don't have access to any other ARCore-compatible device right now, so I'm not sure if this behavior is related to the device itself or to the unstable WebXR Device API.
Does anyone have an idea how to construct an 'XRWebGLLayer' (i.e., mark the context as XR compatible)?
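For what it's worth, setCompatibleXRDevice() belongs to an older draft of the API; in the shipped spec the context is marked compatible either at creation time with the xrCompatible context attribute, or afterwards via makeXRCompatible(). A hedged sketch (ensureXRCompatible is an illustrative helper, not part of the samples):

```javascript
// Sketch: mark an existing WebGL context as XR compatible. At creation
// time the same effect comes from:
//   canvas.getContext('webgl', { xrCompatible: true });
async function ensureXRCompatible(gl) {
  if (gl && typeof gl.makeXRCompatible === 'function') {
    await gl.makeXRCompatible();
    return true;
  }
  return false; // older browser build; a polyfill may be needed
}
```

Once the context is compatible, new XRWebGLLayer(session, gl) should construct without the "not marked as XR compatible" DOMException.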
I’ve really enjoyed going through the examples and am hopeful that WebXR is an intuitive option for developing VR experiences.
However, I'm finding it difficult to get my own glTF files to load.
Through trial and error I've found that Microsoft's 3D Builder application is unreliable for exporting to the glTF file format, and I have found that using Blender I can manipulate the files with the content that is already there. For example, with the immersive-vr.html example I can use Blender to copy the scene; I made one copy above and one copy below, and this I was able to export and load into the example. However, if I try to load a different model, it doesn't load. Even if I import one of my own models, for example a spaceship, into the space scene, the spaceship is never visible.
Is there a simple tutorial that goes through making the models and importing and loading them?
I would like to build one that starts from complete scratch: create a model in Blender, export to the right format, and then build on from there.
I've also learned how to animate the model in Blender, and it would be nice to have a simple getting-started tutorial on importing and loading an animated glb file into a WebXR scene.
I have tried this with the examples as they are, with no success; for some reason, only the models that come with these samples seem to work.
Is there a tutorial on designing, exporting, importing, and loading a 3D model?
The changes to accommodate the new spec revisions in Canary made their way to stable (m73). I was unable to fix the samples following the current spec: https://immersive-web.github.io/webxr/
I saw discrepancies between the spec and the Chrome implementation. For instance, xrSession.updateRenderState is not defined.
Are there any docs on Chrome's current implementation, or any samples you can point me to?
I'd love to make a web page to share photos I've taken with my 180x180 stereo camera using WebXR, so I've been trying to adapt the 360 photo example for this use case.
I was able to add support for stereoRightLeft, since I've reversed some of my 180 L/R images to allow for cross-eye viewing, but my attempts to stretch the image across only one hemisphere have failed.
Could someone who groks the vertex math here rig up an extra mode for this in skybox.js? I'd be more than happy to provide a test image under the same license.
It would be particularly nice if there was a lean example that rendered a very simple 3D model.
I'm terribly sorry if this is not the correct place for this question, but I don't know where else to turn. I would like to play with the shiny new toy called WebXR, but it doesn't seem to be working in Chrome 67.
When I run the following command
navigator.xr.requestDevice()
in the console of chrome (Version 67.0.3396.62 (Official Build) (64-bit)) developer tools (after enabling ALL WebXR flags under chrome://flags and restarting the browser) I receive this error.
Uncaught (in promise) DOMException: No devices found.
This sample code also didn't work... https://github.com/immersive-web/webxr-samples/blob/master/xr-barebones.html
I tried this on a desktop and mobile Chrome browser.
Why is the WebXR API not working in Chrome 67?
Thank you
I also posted this question here (https://stackoverflow.com/questions/50623074/webxr-not-working-in-chrome-67-with-all-xr-flags-enabled) on Stackoverflow.
I've gotten a report from testers at Google that, in the audio sample, if the audio is paused and then resumed, the "bouncing" animation of the speakers that's synced with the audio fails to start again. This appears to happen on any device used to test.
Using Huawei P20 I can't run any of the immersive-ar samples.
I can click "Start AR" but when I do the following happens:
Chrome versions tested: Chrome Beta (81), Dev (82) and Canary (82), all with the same result. No extra flags added, but they should not be needed with these versions?
Tested with remote debugging and no console errors show.
The immersive-vr samples work fine.
Is this a result of yet new breaking changes in the webXR spec or something specific with my phone?
Why don't the samples work in Chrome on PC, using the Oculus (all flags are enabled)?
On Firefox, using the Oculus on PC, some of them work, but at 19-20 fps, and only if I start the application with the polyfill. Why is that, if it should also work without the polyfill?
So why do all the applications work well on a smartphone, but not on PC with the Oculus?
Modify the cube sea texture: change the "WebVR" text to "WebXR"?
https://github.com/immersive-web/webxr-samples/blob/master/media/textures/cube-sea.png
The isMobile function here is not defined:
webxr-samples/js/webxr-version-shim.js
Lines 138 to 144 in 82f8fbb
There is a this._isMobile() check in the parent scope, so I guess on desktop the function will not even exist.
Side question: would it perhaps make sense to publish this shim as part of the webxr-polyfill? For the moment I'm copying it into my code.
Hi to all,
I tried the sunflower example and it works, but if I replace the .gltf file with another file with the *.glb extension, it doesn't work. The glb file used is version 2.
Thank you :)
As in the title: I do not know what is wrong, but it can take up to 2-3 minutes for it to start working.
Using a Samsung Galaxy 7.
re: using webxr-samples/hit-test.html
Is there some sort of special specification one needs to meet to swap in another glTF file?
It only works with "sunflower.gltf".
Size, scale, verts?
Thanks
Running the hit-box sample in Chrome 79 throws the following error:
The provided value 'immersive-ar' is not a valid enum value of type XRSessionMode.
On Safari / iOS 13 no error is thrown, but navigator.xr.isSessionSupported('immersive-ar') returns false.
I tried both systems with and without using the polyfill.
Is this the expected behavior?