# Reality Mixer JS
This is a small Mixed Reality Capture module for WebXR + Three.js.
You can use this module to allow users to record your WebXR experiences in Mixed Reality.
Unlike the original Reality Mixer or LIV, the Mixed Reality composition happens in the browser itself. However, users will still need an external application to record their browser (e.g. OBS).
Keep in mind that this is still just a prototype and that I'm not a frontend developer. Feel free to open PRs and contribute to the project.
YouTube video
## Screenshots
| Example | Screenshot |
| --- | --- |
| webxr_vr_ballshooter | |
| webxr_vr_paint | |
## Requirements
- PC VR headset that's compatible with WebXR (e.g. Oculus Rift S).
- Google Chrome.
- Webcam and a green screen. Alternatively, you can use your iPhone/iPad as a camera (without a green screen) with the LIV Camera app and NDI Webcam Input.
## How to test the example
1. Clone this repository.

2. Run `npm ci` to download the dependencies.

3. Run `http-server` to start the HTTP server (it can be installed by running `npm install -g http-server`).

4. WebXR and `navigator.mediaDevices` require HTTPS (unless you're accessing the page via `localhost`). You could use a tool like localtunnel for testing. Run `npm install -g localtunnel` to install it, then run `lt --port 8080 --subdomain 127.0.0.1` in a separate terminal.

5. Open your browser and navigate to `https://{your HTTPS domain}/examples/webxr_vr_ballshooter.html` (or `https://127.0.0.1:8080/examples/webxr_vr_ballshooter.html`).
Your browser will ask for permission to access your camera, and it'll ask for permission to use your VR headset once you click on the WebXR button.
You'll need to complete the calibration before starting the example, and you'll need to recalibrate whenever you change your guardian boundary / play area, change the position and orientation of your camera, or change your green screen.
## API
```js
import * as THREE from 'three';
import * as MRC from 'reality-mixer';

let mixedRealityCapture;
let scene, renderer, camera;

// ... set up your scene, renderer, and camera as usual ...

const cameraCalibration = new MRC.CameraCalibration(
    1920,         // video width
    1080,         // video height
    38,           // camera fov
    [0, 1.5, 0],  // camera position
    [0, 0, 0, 1]  // camera orientation (quaternion)
);

const chromaKey = new MRC.ChromaKey(
    [0, 1, 0],    // chroma key color
    0.25,         // similarity
    0,            // smoothness
    [0, 0, 0, 0]  // crop
);

const calibration = new MRC.Calibration(
    cameraCalibration,
    chromaKey,
    4,            // delay
);

// Hide the default renderer output; the composed
// Mixed Reality capture is displayed instead.
renderer.domElement.style.display = "none";

mixedRealityCapture = new MRC.MixedRealityCapture( calibration );
document.body.appendChild( mixedRealityCapture.domElement );

// Call this whenever the window is resized.
mixedRealityCapture.onWindowResize();

// In your render loop:
renderer.render( scene, camera );
mixedRealityCapture.render( renderer.xr, scene );
```
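The camera `orientation` is a four-component value whose default is `[0, 0, 0, 1]`, which matches the usual `[x, y, z, w]` quaternion layout (an assumption on my part, so verify against your setup). If your physical camera is only rotated about the vertical axis, a small helper like this hypothetical one can produce that value:

```javascript
// Hypothetical helper (not part of reality-mixer): convert a rotation about
// the vertical (y) axis, in degrees, into an [x, y, z, w] quaternion for the
// CameraCalibration `orientation` parameter, assuming that layout.
function yawToQuaternion(degrees) {
    const half = (degrees * Math.PI / 180) / 2;
    // Rotation about the y axis: the x and z components stay zero.
    return [0, Math.sin(half), 0, Math.cos(half)];
}

// A camera facing straight ahead uses the identity quaternion [0, 0, 0, 1].
```

Three.js users can compute the same value with `new THREE.Quaternion().setFromEuler(...)` and read off its `x`, `y`, `z`, and `w` components.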
Alternatively, you can instantiate the calibration with a JSON provided by the user:
```js
const json = `
{
    "schemaVersion": 1,
    "camera": {
        "width": 1280,
        "height": 720,
        "fov": 38,
        "position": [0, 1.5, 0],
        "orientation": [0, 0, 0, 1]
    },
    "chromaKey": {
        "color": [0, 1, 0],
        "similarity": 0.25,
        "smoothness": 0,
        "crop": [0, 0, 0, 0]
    },
    "delay": 4
}
`;

const calibrationData = JSON.parse( json );
const calibration = MRC.Calibration.fromData( calibrationData );

mixedRealityCapture = new MRC.MixedRealityCapture( calibration );
```
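Since the JSON comes from the user, it's worth checking it against the schema above before handing it to `MRC.Calibration.fromData`. A minimal sketch (the `validateCalibrationData` helper is hypothetical, not part of the library):

```javascript
// Hypothetical validator: returns a list of problems found in a
// user-supplied calibration object, empty if it matches the schema above.
function validateCalibrationData(data) {
    const errors = [];
    if (data.schemaVersion !== 1) errors.push("unsupported schemaVersion");
    const cam = data.camera;
    if (!cam || typeof cam.width !== "number" || typeof cam.height !== "number"
            || typeof cam.fov !== "number"
            || !Array.isArray(cam.position) || cam.position.length !== 3
            || !Array.isArray(cam.orientation) || cam.orientation.length !== 4) {
        errors.push("invalid camera section");
    }
    const key = data.chromaKey;
    if (!key || !Array.isArray(key.color) || key.color.length !== 3
            || typeof key.similarity !== "number"
            || typeof key.smoothness !== "number"
            || !Array.isArray(key.crop) || key.crop.length !== 4) {
        errors.push("invalid chromaKey section");
    }
    if (typeof data.delay !== "number") errors.push("invalid delay");
    return errors;
}
```

If the returned list is empty, the object can be passed on to `MRC.Calibration.fromData`; otherwise you can show the problems to the user instead of failing inside the library.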
## TO-DOs

- Continue iterating on the Calibration (fixes, delay, adjustments, etc.).
- Create a static website to host the examples.