The VR module implements support for virtual reality devices in Unity.
|Manager class with API for recognizing user gestures.
|The Holographic Remoting interface allows you to connect an application to a remote holographic device, and stream data between the application and that device.
|The Holographic Settings contain functions which affect the performance and presentation of Holograms on Windows Holographic platforms.
|Provides access to user input from hands, controllers, and system voice commands.
|Captures a photo from the web camera and stores it in memory or on disk.
|Contains information captured from the web camera.
|SurfaceObserver is the main API portal for spatial mapping functionality in Unity.
|Records a video from the web camera directly to disk.
|Contains general information about the current state of the web camera.
|The WorldAnchor component allows a GameObject's position to be locked in physical space.
|The storage object for persisted WorldAnchors.
|A batch of WorldAnchors which can be exported and imported between apps.
|This class represents the state of the real world tracking system.
|Contains all functionality related to an XR device.
|Global XR related settings.
|Timing and other statistics from the XR subsystem.
|When calling PhotoCapture.StartPhotoModeAsync, you must pass in a CameraParameters object that contains the various settings that the web camera will use.
|Contains fields that are relevant during an error event.
|Contains fields that are relevant when a user cancels a hold gesture.
|Contains fields that are relevant when a hold gesture completes.
|Contains fields that are relevant when a hold gesture starts.
|Represents one detected instance of an interaction source (hand, controller, or user's voice) that can cause interactions and gestures.
|Contains fields that are relevant when an interaction source is detected.
|Contains fields that are relevant when an interaction source is lost.
|Pose data of the interaction source at the time of either the gesture or interaction.
|Contains fields that are relevant when an interaction source enters the pressed state for one of its buttons.
|Represents the set of properties available to explore the current state of a hand or controller.
|Contains fields that are relevant when an interaction source exits the pressed state for one of its buttons.
|Represents a snapshot of the state of a spatial interaction source (hand, voice or controller) at a given time.
|Contains fields that are relevant when an interaction source updates.
|Contains fields that are relevant when a manipulation gesture is canceled.
|Contains fields that are relevant when a manipulation gesture completes.
|Contains fields that are relevant when a manipulation gesture starts.
|Contains fields that are relevant when a manipulation gesture gets updated.
|Contains fields that are relevant when a navigation gesture is canceled.
|Contains fields that are relevant when a navigation gesture completes.
|Contains fields that are relevant when a navigation gesture starts.
|Contains fields that are relevant when a navigation gesture updates.
|Contains fields that are relevant when recognition of a gesture event ends.
|Contains fields that are relevant when recognition of a gesture event begins.
|SurfaceData is a container struct used for requesting baked spatial mapping data and receiving that data once baked.
|SurfaceId is a structure wrapping the unique ID used to denote Surfaces. SurfaceIds are provided through the onSurfaceChanged callback in Update and returned after a RequestMeshAsync call has completed. SurfaceIds are guaranteed to be unique, though Surfaces are sometimes replaced with a new Surface in the same location with a different ID.
|Contains fields that are relevant when a tap gesture occurs.
|The encoded image or video pixel format to use for PhotoCapture and VideoCapture.
|Enumeration of available modes for XR rendering in the Game view or in the main window on a host PC. XR rendering only occurs when the Unity Editor is in Play Mode.
|This enumeration represents the set of gestures that may be recognized by GestureRecognizer.
|Enum indicating the reason why the connection to a remote device has failed.
|Current state of the holographic streamer remote connection.
|Denotes which hand was used as the input source.
|Specifies the kind of an interaction source.
|Specifies which part of the controller to query pose information for.
|Denotes the accuracy of tracking on the interaction source.
|The type of button or controller feature pressed, if any.
|Image Encoding Format.
|Indicates the lifecycle state of the device's spatial location system.
|This enum represents the result of a WorldAnchorTransferBatch operation.
|Enumeration of the different types of SurfaceChange events.
|Represents how the device is reporting pose data.
|Represents the size of physical space available for XR.
|Represents the current user presence state detected by the device.
|Describes the active mode of the Web Camera resource.
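As a sketch of how the gesture types above fit together, the following shows a GestureRecognizer configured to recognize taps. The class and event names (GestureRecognizer, GestureSettings, Tapped, TappedEventArgs) are from the UnityEngine.XR.WSA.Input namespace described in this module; the component name TapLogger is illustrative.

```csharp
using UnityEngine;
using UnityEngine.XR.WSA.Input;

// Illustrative MonoBehaviour: logs the source kind of each recognized tap.
public class TapLogger : MonoBehaviour
{
    GestureRecognizer recognizer;

    void Start()
    {
        recognizer = new GestureRecognizer();
        // Restrict recognition to the Tap gesture from the GestureSettings enum.
        recognizer.SetRecognizableGestures(GestureSettings.Tap);
        recognizer.Tapped += OnTapped;
        recognizer.StartCapturingGestures();
    }

    void OnTapped(TappedEventArgs args)
    {
        // args.source is an InteractionSource (hand, controller, or voice).
        Debug.Log("Tap from source kind: " + args.source.kind);
    }

    void OnDestroy()
    {
        recognizer.Tapped -= OnTapped;
        recognizer.Dispose();
    }
}
```

The hold, manipulation, and navigation event-args types listed above follow the same pattern: subscribe to the corresponding started/updated/completed/canceled events before calling StartCapturingGestures.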
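The PhotoCapture and CameraParameters entries above interact as follows: a CameraParameters struct is filled in and passed to StartPhotoModeAsync, as the description notes. This sketch assumes a device with a web camera; the component name PhotoTaker and the resolution choice are illustrative.

```csharp
using System.Linq;
using UnityEngine;
using UnityEngine.XR.WSA.WebCam;

// Illustrative MonoBehaviour: starts photo mode at the highest supported resolution.
public class PhotoTaker : MonoBehaviour
{
    PhotoCapture photoCapture;

    void Start()
    {
        // false: do not show holograms in the captured image.
        PhotoCapture.CreateAsync(false, OnCaptureCreated);
    }

    void OnCaptureCreated(PhotoCapture capture)
    {
        photoCapture = capture;
        Resolution res = PhotoCapture.SupportedResolutions
            .OrderByDescending(r => r.width * r.height)
            .First();

        // CameraParameters carries the settings the web camera will use.
        CameraParameters parameters = new CameraParameters
        {
            hologramOpacity = 0f,
            cameraResolutionWidth = res.width,
            cameraResolutionHeight = res.height,
            pixelFormat = CapturePixelFormat.BGRA32
        };

        photoCapture.StartPhotoModeAsync(parameters, result =>
        {
            Debug.Log("Photo mode started: " + result.success);
        });
    }
}
```

VideoCapture follows the same asynchronous create/start pattern, writing directly to disk instead of returning frames in memory.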
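The WorldAnchor and WorldAnchorStore entries above combine as in this minimal sketch: load a persisted anchor if one exists, otherwise lock the GameObject in place and save it. The anchor name "my-anchor" and the component name AnchorPlacer are illustrative.

```csharp
using UnityEngine;
using UnityEngine.XR.WSA;
using UnityEngine.XR.WSA.Persistence;

// Illustrative MonoBehaviour: persists this GameObject's position in physical space.
public class AnchorPlacer : MonoBehaviour
{
    void Start()
    {
        // The store is loaded asynchronously; the callback receives it when ready.
        WorldAnchorStore.GetAsync(store =>
        {
            // Try to restore a previously saved anchor onto this GameObject.
            WorldAnchor anchor = store.Load("my-anchor", gameObject);
            if (anchor == null)
            {
                // No saved anchor: lock the current position and persist it.
                anchor = gameObject.AddComponent<WorldAnchor>();
                if (anchor.isLocated)
                {
                    store.Save("my-anchor", anchor);
                }
            }
        });
    }
}
```

For sharing anchors between apps or devices, WorldAnchorTransferBatch exports a batch of anchors on one device and imports it on another, with the result reported via the transfer-batch result enum listed above.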