Build and Stream Browser-Based XR Experiences with NVIDIA CloudXR.js



Delivering high-fidelity VR and AR experiences to enterprise users has typically required native application development, custom device management, and complicated deployment pipelines. Now, with NVIDIA CloudXR.js, a new JavaScript SDK, developers can stream GPU-rendered immersive content directly to a standard web browser—no app store, no installs, no device-specific builds.

NVIDIA CloudXR.js brings the full power of NVIDIA RTX remote rendering to the web platform. This is a fundamental shift in how immersive applications are built and delivered. NVIDIA CloudXR.js expands access to enterprise XR beyond native development workflows and into the broad web developer community.

Developers building digital twins in NVIDIA Omniverse, robot teleoperation systems, or interactive 3D training environments can now reach users on XR headsets through a URL. This post walks through the SDK architecture, its core API, and how to connect it to server applications such as Omniverse, NVIDIA Isaac Lab, and LÖVR.

How to run and host an NVIDIA CloudXR.js client

This section explains how to run and host an NVIDIA CloudXR.js web client, including prerequisites and an architecture overview.

Prerequisites

To run and host a CloudXR.js-based web client, you need a computer with Node.js v20 or higher and npm installed. Check out the NVIDIA CloudXR.js Sample Client to try it without building the web client. You'll still need an NVIDIA CloudXR server running the CloudXR Runtime and an OpenXR-compatible app. Prerequisites include:

  • Client side
    • Node.js v20 or higher and npm
    • A WebGL2- and WebXR-compatible browser (for example, Meta Quest Browser or Pico Browser on headsets, or any browser that runs an emulator)
    • Familiarity with JavaScript/TypeScript and basic WebGL concepts
    • A Meta Quest 2/3/3s (OS v79+) or Pico 4 Ultra (Pico OS 15.4.4U+) for headset testing
  • Server side
    • An NVIDIA GPU-equipped server that meets the latest requirements, running NVIDIA CloudXR Runtime
    • An OpenXR-compatible server application such as Omniverse, Isaac Lab, or LÖVR
  • A WiFi 6 or 6E network with <20 ms latency and 100+ Mbps bandwidth is highly recommended
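On the client side, browser support can be verified before attempting a session. Here is a minimal sketch using the standard WebXR `isSessionSupported` call; the injectable `nav` parameter is an illustration device so the check can also run outside a browser, and is not part of CloudXR.js:

```typescript
// Minimal WebXR capability check (sketch). In a real page you would
// pass the global `navigator` object as `nav`.
interface XRSystemLike {
  isSessionSupported(mode: string): Promise<boolean>;
}

async function canStreamXR(nav: { xr?: XRSystemLike }): Promise<boolean> {
  if (!nav.xr) return false; // WebXR not exposed by this browser
  return nav.xr.isSessionSupported('immersive-vr');
}

// In an environment without WebXR (for example, Node), the check fails gracefully:
canStreamXR({}).then((ok) => console.log('XR available:', ok)); // → XR available: false
```

A client would typically run this check before showing its connection UI and fall back to a desktop emulator when it returns false.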

CloudXR.js architecture

CloudXR.js uses a two-tier connection model that separates web application hosting from the XR streaming pipeline (Figure 1).

Client web server: A standard Node.js development server (or any static hosting solution) serves the web application to client devices over HTTP or HTTPS.

CloudXR Runtime connection: The SDK establishes a WebSocket connection from the browser to the CloudXR Runtime running on the server. This channel carries the WebRTC-based video stream, pose tracking data, and controller/hand input.
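The SDK manages this WebSocket internally, but the effect of the `useSecureConnection` option is easy to picture: it selects between `ws://` and `wss://` when the connection URL is formed. The function below is illustrative only; the real endpoint format is internal to CloudXR.js:

```typescript
// Illustrative sketch of how session options map to a WebSocket URL.
// Not part of the CloudXR.js API — the SDK builds its own endpoint internally.
function buildSignalingUrl(opts: {
  serverAddress: string;
  serverPort: number;
  useSecureConnection: boolean;
}): string {
  const scheme = opts.useSecureConnection ? 'wss' : 'ws';
  return `${scheme}://${opts.serverAddress}:${opts.serverPort}`;
}

console.log(buildSignalingUrl({
  serverAddress: '192.168.1.100',
  serverPort: 49100,
  useSecureConnection: false,
})); // → ws://192.168.1.100:49100
```

With the WSS proxy described later in this post, the same client options would instead point at the proxy's secure port with `useSecureConnection: true`.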

On the server side, the CloudXR Runtime pairs with any OpenXR-compatible application. It captures rendered stereo frames, encodes them using hardware-accelerated AV1, H.265, or H.264, and streams them to the client. The client decodes the video, composites it into the WebXR framebuffer, and sends tracking data back to the server, closing the loop at up to 120 frames per second.

The web application itself is framework-agnostic. CloudXR.js integrates with vanilla WebGL, React Three Fiber, or any other WebXR-compatible library.

Install the SDK

To add CloudXR.js to a project, download the SDK from NVIDIA NGC, then import it into your project.

The package includes TypeScript type definitions, so IDE autocompletion and type checking work out of the box.

Create a streaming session

The entire SDK surface centers on a single entry point: createSession. Import it, configure the connection and rendering parameters, and pass optional delegate callbacks to handle lifecycle events. The following code snippet shows the core pieces of using the createSession API.

// Basic session creation
const session = createSession({
  serverAddress: '192.168.1.100',
  serverPort: 49100,
  useSecureConnection: false,
  perEyeWidth: 2048,
  perEyeHeight: 1792,
  // from the WebGL API
  gl: webglContext,
  // from the WebXR API
  referenceSpace: xrReferenceSpace
});

// With event delegates
const session = createSession(sessionOptions, {
  onStreamStarted: () => {
    console.info('CloudXR streaming started');
  },
  onStreamStopped: (error) => {
    if (error) {
      console.error('Streaming error:', error);
    } else {
      console.info('Streaming stopped normally');
    }
  }
});

// Connect to CloudXR Runtime
if (session.connect()) {
  console.info('Connection initiated');
}

The perEyeWidth and perEyeHeight values (which must be multiples of 16) define the resolution of each eye's view. The SDK automatically derives the full stream resolution from these values.
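If you derive these values from a device-recommended resolution, they may need to be aligned down to a multiple of 16 first. A small helper (hypothetical, not part of the SDK) could look like:

```typescript
// Align a requested per-eye dimension down to the nearest multiple of 16,
// as required by perEyeWidth/perEyeHeight. Helper is illustrative only.
function alignTo16(value: number): number {
  return Math.floor(value / 16) * 16;
}

console.log(alignTo16(2050)); // → 2048
console.log(alignTo16(1792)); // → 1792 (already aligned)
```

Aligning down rather than up keeps the stream within the resolution the device reported, at the cost of at most 15 pixels per axis.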

Integrate the render loop

Once connected, drive the streaming pipeline from your WebXR render loop. Each frame requires two calls: send the current tracking state to the server, then render the received frame.

function onXRFrame(time: number, frame: XRFrame) {
  // Send head pose, controllers, and hand tracking to the server
  session.sendTrackingStateToServer(time, frame);

  // Render the streamed frame into the WebXR layer
  session.render(time, frame, xrSession.renderState.baseLayer);
  // Continue the loop
  xrSession.requestAnimationFrame(onXRFrame);
}
xrSession.requestAnimationFrame(onXRFrame);

The sendTrackingStateToServer method automatically captures and transmits controller button presses, trigger values, controller poses, hand tracking data (when supported by the device), and the viewer's head pose. The server uses this data to render the next frame from the correct viewpoint with the correct input state. The render method then composites the decoded video into the XR display.
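A common pattern built on the onStreamStopped delegate is automatic reconnection with exponential backoff. The schedule below is an assumption for illustration, not part of the CloudXR.js API:

```typescript
// Illustrative reconnect backoff policy for use inside an onStreamStopped
// delegate. The base delay and cap are arbitrary example values.
function backoffDelayMs(attempt: number, baseMs = 500, capMs = 8000): number {
  return Math.min(baseMs * 2 ** attempt, capMs);
}

// attempts 0..4 → 500, 1000, 2000, 4000, 8000 ms, then capped at 8000
```

Inside onStreamStopped, you might schedule something like `setTimeout(() => session.connect(), backoffDelayMs(attempt++))` until the stream restarts or a retry limit is reached.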

Explore sample clients

CloudXR.js offers two sample clients that demonstrate different integration approaches.

WebGL sample

The WebGL sample (simple/) provides a minimal, single-file TypeScript implementation that works directly with the WebXR and WebGL2 APIs. It features a connection configuration UI, browser capability checks, and a simple render loop. This is the fastest path to understanding how CloudXR.js works under the hood.

React sample

The React sample (react/) demonstrates a production-style architecture using React Three Fiber, React Three XR, and React Three UIKit. It features component-based session management, a dual UI system (2D HTML for configuration and 3D in-VR panels for interaction), and WebGL state tracking to prevent rendering conflicts between React Three Fiber and CloudXR.

Both samples support Docker deployment for quick testing:

# WebGL sample
docker build -t cloudxr-js-sample --build-arg EXAMPLE_NAME=simple .
docker run -d --name cloudxr-js-sample -p 8080:80 -p 8443:443 cloudxr-js-sample

# React sample
docker build -t cloudxr-react-sample --build-arg EXAMPLE_NAME=react .
docker run -d --name cloudxr-react-sample -p 8080:80 -p 8443:443 cloudxr-react-sample

For local development with hot reloading, install dependencies and start the dev server:

cd simple  # or react
npm install
npm run dev  # start the dev server (script name assumed from convention)

Navigate to http://localhost:8080 in your browser. On desktop, the Immersive Web Emulator Runtime (IWER) automatically loads to emulate a Meta Quest 3, so you can develop and test without a physical headset.

Connect to server applications

CloudXR.js works with any OpenXR-compatible application running alongside the CloudXR Runtime. The following three examples demonstrate the breadth of what you can stream: NVIDIA Omniverse, NVIDIA Isaac Lab, and LÖVR.

NVIDIA Omniverse

Stream high-fidelity USD digital twin scenes to XR headsets for architecture walkthroughs, design reviews, and industrial training. Omniverse Kit SDK 109.0.2 and later includes an integrated CloudXR WebRTC runtime, so the streaming pipeline is built directly into the platform. Operators interact with 3D content using hand tracking for direct manipulation inside the streamed environment.

NVIDIA Isaac Lab

Build teleoperation workflows for dexterous robots. An operator wearing a Quest 2/3/3s or Pico 4 Ultra sees a real-time stereo rendering of the robot simulation and uses hand tracking to control the robot. Isaac Lab runs on Linux with Docker and the NVIDIA Container Toolkit, supporting dual-GPU configurations for maximum performance. For details, see Setting Up CloudXR Teleoperation.

LÖVR

LÖVR is a lightweight, open-source, Lua-based VR framework that provides a simple path to a working server. Launch LÖVR with the --webrtc flag and connect a CloudXR.js client. This is ideal for rapid prototyping and testing your client setup. Visit NVIDIA/cloudxr-lovr-sample on GitHub to get started.

Configure networking for production

For production deployments or HTTPS hosting, set up a WebSocket proxy with TLS termination. CloudXR.js includes a sample Docker-based HAProxy configuration that handles this automatically:

cd proxy
docker build -t cloudxr-wss-proxy .
docker run -d --name wss-proxy --network host \
  -e BACKEND_HOST=localhost \
  -e BACKEND_PORT=49100 \
  -e PROXY_PORT=48322 \
  cloudxr-wss-proxy

The proxy generates self-signed certificates, connects to the CloudXR Runtime on port 49100, and listens for secure WebSocket connections on port 48322. For enterprise Kubernetes deployments, the SDK documentation includes an NGINX Ingress configuration that supports multiple CloudXR servers with load balancing.

Ensure your firewall allows TCP port 49100 (signaling), UDP port 47998 (media streaming), and TCP port 48322 (WSS proxy, if using HTTPS).

Get started with CloudXR.js

NVIDIA CloudXR.js brings enterprise XR streaming to the online platform—GPU-rendered immersive content delivered through a URL, with no native app required. The SDK provides a clean, minimal API that integrates with any WebXR-compatible framework and supports multiple server applications, from Omniverse digital twins to Isaac Lab robot teleoperation. It includes the networking and performance tooling needed for production deployments.

By giving every web developer the tools to build and ship immersive experiences without the overhead of native XR development, CloudXR.js makes entirely new categories of applications possible. We're excited to see what the developer community builds with this new SDK.

Download CloudXR.js and try the sample clients. For the fastest path to a working demo, start with the LÖVR sample server and the WebGL client. For the complete API reference, configuration guides, and deployment documentation, visit the CloudXR.js SDK documentation.


