Version: 2023.1

VR development in Unity

VR development shares common workflows and design considerations with any real-time 3D development in Unity. However, distinguishing factors include:

  • Richer user input: in addition to “traditional” button and joystick controllers, VR devices provide spatial head, controller, and (in some cases) hand and finger tracking.
  • More “intimate” interaction with the environment: in conjunction with the possibilities of richer input, VR raises the expectations of much closer and “physical” interaction with the environment than typical 3D games and applications. Users expect to be able to pick things up and interact with objects in the environment. With head tracking, the camera can get much closer to the walls and other boundaries of the environment – even passing through them.
  • User comfort concerns: many people experience motion sickness in VR when camera movement doesn’t match the movement of their head. You can mitigate the causes of motion sickness by maintaining a high frame rate, offering a range of locomotion options so that users can choose a mode they are comfortable with, and avoiding moving the camera independently of the user’s head tracking.
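
A practical consequence of the last point: locomotion code should translate the XR Origin rig rather than the camera, because the camera's local pose is owned by head tracking. A minimal sketch (the class and field names here are illustrative, not part of any Unity package):

```csharp
using UnityEngine;

// Hypothetical locomotion helper: moves the whole rig (the XR Origin root),
// never the tracked camera itself, so the view always follows the user's head.
public class RigMover : MonoBehaviour
{
    [SerializeField] Transform rig;      // assign the XR Origin root in the Inspector
    [SerializeField] float speed = 1.5f; // metres per second

    public void Move(Vector2 input)
    {
        // Translate the rig; head tracking remains in control of the camera pose.
        Vector3 delta = new Vector3(input.x, 0f, input.y) * (speed * Time.deltaTime);
        rig.Translate(delta, Space.World);
    }
}
```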

To get started with VR development, use the XR Plug-in Management system to install and enable XR provider plug-ins for the devices you want to support. See XR Project set up for more information.

Basic VR scene elements

A basic VR scene should contain an XR Origin, which defines the 3D origin for tracking data. This collection of GameObjects and components also contains the main scene Camera and the GameObjects representing the user’s controllers. See Set up an XR scene for instructions on setting up a basic VR scene.
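
Once an XR Origin is in the scene, scripts can locate it and read the tracked camera it manages. A small sketch, assuming the XR Core Utilities package (which defines `XROrigin`) is installed:

```csharp
using Unity.XR.CoreUtils;
using UnityEngine;

public class OriginInfo : MonoBehaviour
{
    void Start()
    {
        // Find the XR Origin and log where tracking space is anchored.
        var origin = FindObjectOfType<XROrigin>();
        if (origin != null)
            Debug.Log($"Tracking origin at {origin.transform.position}, camera: {origin.Camera.name}");
    }
}
```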

Beyond the basics, you typically need a way for the user to move around and to interact with the 3D world you have created. The XR Interaction Toolkit provides components for creating interactions like selecting and grabbing objects. It also provides a customizable locomotion system. You can use the Input System in addition to or instead of the XR Interaction Toolkit.
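
To make an object grabbable with the XR Interaction Toolkit, you typically add an `XRGrabInteractable` component (plus a Collider and Rigidbody) in the Inspector. The same setup can be sketched in code; this runtime approach is illustrative rather than the usual workflow:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Adds the components a grabbable prop needs when the scene loads.
public class MakeGrabbable : MonoBehaviour
{
    void Awake()
    {
        // XRGrabInteractable needs a Collider to be selectable
        // and a Rigidbody for physics-based grabbing.
        if (GetComponent<Collider>() == null) gameObject.AddComponent<BoxCollider>();
        if (GetComponent<Rigidbody>() == null) gameObject.AddComponent<Rigidbody>();
        gameObject.AddComponent<XRGrabInteractable>();
    }
}
```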

VR packages

Most of the features and APIs used for VR development in Unity are provided through packages. These packages include:

VR provider plug-ins

To build VR apps in Unity, use the XR Plug-in Management system to add and enable provider plug-ins for the devices you want to support. See XR Project set up for instructions.

The VR provider plug-ins supported by Unity include:

  • Oculus for Oculus Rift, Meta Quest 2, and Quest Pro.
  • OpenXR for any device with an OpenXR runtime, including Meta headsets, Vive headsets, Valve SteamVR, HoloLens, Windows Mixed Reality, and others.
  • PlayStation VR (available to registered PlayStation developers) for Sony PS VR and PS VR2 devices. See PlayStation Partners for more information.
  • Mock HMD for simulating a VR headset in the Unity Editor Play mode view.
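
At runtime you can check which provider plug-in XR Plug-in Management actually initialized, which is useful when a single build targets several devices. A sketch, assuming the XR Plug-in Management package is installed:

```csharp
using UnityEngine;
using UnityEngine.XR.Management;

public class ProviderCheck : MonoBehaviour
{
    void Start()
    {
        // The active loader corresponds to the provider plug-in that initialized.
        var manager = XRGeneralSettings.Instance != null
            ? XRGeneralSettings.Instance.Manager
            : null;

        if (manager != null && manager.activeLoader != null)
            Debug.Log($"Active XR provider: {manager.activeLoader.name}");
        else
            Debug.Log("No XR provider is active (for example, running without a headset).");
    }
}
```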

Note: Many headset makers are working toward using the OpenXR runtime as a standard. However, this process is not complete, and there can be feature discrepancies between OpenXR and a headset maker’s own provider plug-in or SDK.

XR Interaction Toolkit

The XR Interaction Toolkit can make it easier and faster to develop VR applications. It provides ready-made components for interactions such as selecting and grabbing objects, as well as a customizable locomotion system.

XR Core Utilities

The XR Core Utilities package contains software utilities used by other Unity XR plug-ins and packages. Typically, this package gets installed in your project as a dependency of other XR packages.

Input System

The Unity Input System package not only supports accessing user input from VR controller buttons and joysticks, but also provides access to XR tracking data and haptics. The Input System package is required if you use the XR Interaction Toolkit or the OpenXR provider plug-in.
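
For example, the Input System lets you bind an action directly to an XR controller control path and read it each frame. A minimal sketch, assuming the Input System package is installed:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class TriggerReader : MonoBehaviour
{
    InputAction trigger;

    void OnEnable()
    {
        // Bind directly to the right-hand controller's trigger control path.
        trigger = new InputAction(binding: "<XRController>{RightHand}/trigger");
        trigger.Enable();
    }

    void Update()
    {
        float value = trigger.ReadValue<float>();
        if (value > 0.5f)
            Debug.Log($"Trigger pulled: {value:F2}");
    }

    void OnDisable() => trigger.Disable();
}
```

In practice you would usually define actions in an Input Actions asset rather than in code; the inline binding above keeps the example self-contained.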

VR template

Unity’s VR Project Template provides a starting point for virtual reality development in Unity. The template configures project settings, pre-installs the right packages, and includes a sample scene with various pre-configured example assets to demonstrate how to set up a project that is ready for VR. Access the VR template through the Unity Hub when you create a new project. Refer to Create a new project for information about creating a project with the template.

For more information about the template assets and how the sample scene is set up, refer to About the VR Project Template.

Hand tracking

Hand tracking is a feature that allows users to interact with a VR application using their hands. Hand tracking is supported by the XR Hands package.

The Hands package provides:

  • A standard hand data model.
  • An API for accessing hand tracking data.
  • The XR Hand Skeleton Driver component, which maps a set of Transforms to their corresponding hand joints and updates those Transforms as tracking data is received.
  • The XR Hand Mesh Controller, which enables and disables a mesh as hand tracking is acquired or lost.
  • A HandVisualizer sample that demonstrates how to use the hand tracking API.
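
As a sketch of the hand tracking API, the following queries the running `XRHandSubsystem` and reads one joint pose. This assumes the XR Hands package is installed and a provider that supports hand tracking is active:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

public class HandReader : MonoBehaviour
{
    XRHandSubsystem hands;

    void Update()
    {
        if (hands == null)
        {
            // Look up the running hand subsystem provided by the XR Hands package.
            var list = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(list);
            if (list.Count > 0) hands = list[0];
            else return;
        }

        // Read the right index-fingertip joint pose, if tracking is valid.
        XRHandJoint tip = hands.rightHand.GetJoint(XRHandJointID.IndexTip);
        if (tip.TryGetPose(out Pose pose))
            Debug.Log($"Index tip at {pose.position}");
    }
}
```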
Copyright © 2023 Unity Technologies