Version: 2019.2

Unity XR Input

This section provides information on all Unity-supported input devices for virtual reality, augmented reality, and Windows Mixed Reality applications.

XR platforms typically provide a rich diversity of input features for you to take advantage of when designing user interactions. Positions, rotations, touch, buttons, joysticks, and finger sensors all provide specific pieces of data.

At the same time, access to these input features varies between XR platforms. The Vive and the Oculus Rift have subtle differences, while a desktop VR platform and a mobile platform like Daydream differ drastically.

Unity provides a C# struct called InputFeatureUsage, which defines a standard set of physical device elements (such as buttons and triggers) to access user input in a platform-agnostic way. These help you quickly identify input types by name. See XR.Input.CommonUsages for a definition of each InputFeatureUsage.

Each InputFeatureUsage corresponds to a common input action or type. For example, Unity defines the InputFeatureUsage called trigger as a single-axis input controlled by the index finger, regardless of which XR platform you use. You can use InputFeatureUsage to get the trigger state by name, so you don’t need to set up an axis (or a button on some XR platforms) for the conventional Unity Input system.
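
For example, the following sketch (assuming device is a valid UnityEngine.XR.InputDevice obtained through the APIs described later on this page) reads the trigger as a single axis through CommonUsages.trigger:

float triggerValue;
if (device.TryGetFeatureValue(UnityEngine.XR.CommonUsages.trigger, out triggerValue))
{
    Debug.Log("Trigger axis value: " + triggerValue); // 0.0 (released) to 1.0 (fully pressed)
}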

XR input mappings

The following table lists the InputFeatureUsage names for standard controllers and how they map to the controllers of popular XR systems:

InputFeatureUsage | FeatureType | Legacy Input Index [L/R] | WMR | Oculus | GearVR | Daydream | OpenVR (Full) | Vive | OpenVR (Oculus) | OpenVR (WMR)
primary2DAxis | 2D Axis | [(1,2)/(4,5)] | Joystick | Joystick | Joystick | Touchpad | [Trackpad/Joystick] | Trackpad | Joystick | Joystick
trigger | | [9/10] | Trigger | Trigger | Trigger | Trigger | Trigger | Trigger | Trigger | Trigger
grip | | [11/12] | Grip | Grip | Grip | | Grip | Grip | Grip | Grip
indexTouch | | [13/14] | | Index - Near Touch | | | | | |
thumbTouch | | [15/16] | | Thumb - Near Touch | | | | | |
secondary2DAxis | 2D Axis | [(17,18)/(19,20)] | Touchpad | | | | | | | Touchpad
indexFinger | | [21/22] | | | | | Index Finger | | |
middleFinger | | [23/24] | | | | | Middle Finger | | |
ringFinger | | [25/26] | | | | | Ring Finger | | |
pinkyFinger | | [27/28] | | | | | Pinky Finger | | |
combinedTrigger | | [3/3] | Combined Trigger | Combined Trigger | | | Combined Trigger | Combined Trigger | Combined Trigger | Combined Trigger
primaryButton | Button | [2/0] | | [X/A] | Application | Primary | Primary | Primary | [Y/B] | Menu
primaryTouch | Button | [12/10] | | [X/A] - Touch | | | | | |
secondaryButton | Button | [3/1] | | [Y/B] | | | Alternate | Alternate | [B/A] |
secondaryTouch | Button | [13/11] | | [Y/B] - Touch | | | | | |
gripButton | Button | [4/5] | Grip - Press | Grip - Press | Grip - Press | | Grip - Press | Grip - Press | Grip - Press | Grip
triggerButton | Button | [14/15] | Trigger - Press | Index - Touch | Trigger - Press | Trigger - Press | Trigger - Press | Trigger - Press | Trigger - Touch | Trigger - Press
menuButton | Button | [6/7] | Menu | Start (6) | | | | | |
primary2DAxisClick | Button | [8/9] | Touchpad - Click | Thumbstick - Click | Touchpad - Click | Touchpad - Click | Stick/Pad - Press | Stick/Pad - Press | Stick/Pad - Press | Touchpad - Click
primary2DAxisTouch | Button | [16/17] | Touchpad - Touch | Thumbstick - Touch | Touchpad - Touch | Touchpad - Touch | Stick/Pad - Touch | Stick/Pad - Touch | Stick/Pad - Touch | Touchpad - Touch
thumbrest | Button | [18/19] | Joystick - Click | Thumbrest - Touch | | | | | |

See XR.Input.CommonUsages for the definition of each InputFeatureUsage.

Accessing input devices

An InputDevice represents any physical device, such as a controller, cellular phone, or headset, and can contain information on device tracking, buttons, joysticks, and other input controls. See InputDevice for more information on the InputDevice API.

Use the XR.InputDevices class to access input devices (such as controllers and trackers) that are currently connected to the XR system. Use InputDevices.GetDevices to get a list of all connected devices:

var inputDevices = new List<UnityEngine.XR.InputDevice>();
UnityEngine.XR.InputDevices.GetDevices(inputDevices);

foreach (var device in inputDevices)
{
    Debug.Log(string.Format("Device found with name '{0}' and role '{1}'", device.name, device.role.ToString()));
}

An InputDevice remains valid across frames until the XR system disconnects it. Use the InputDevice.isValid property to determine whether an InputDevice still represents an active controller.
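
As a small sketch (assuming a MonoBehaviour field; the re-acquisition strategy here is just one possible choice), you can cache a device across frames and only query for devices again when the cached one stops being valid:

private UnityEngine.XR.InputDevice cachedDevice;

void EnsureDeviceIsValid()
{
    if (!cachedDevice.isValid)
    {
        var devices = new List<UnityEngine.XR.InputDevice>();
        UnityEngine.XR.InputDevices.GetDevices(devices);
        if (devices.Count > 0)
        {
            cachedDevice = devices[0]; // illustrative: simply take the first connected device
        }
    }
}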

Accessing input devices by role

A device role describes the general function of an input device. Use the InputDeviceRole enumeration to specify a device role. The defined roles are:

  • GameController: a console-style game controller.
  • Generic: a device that represents the core XR device, such as a head-mounted display or mobile device.
  • HardwareTracker: a tracking device.
  • LeftHanded: a device associated with the user’s left hand.
  • RightHanded: a device associated with the user’s right hand.
  • TrackingReference: a device that tracks other devices, such as an Oculus tracking camera.

The underlying XR SDK reports these roles, and different providers can organize their device roles differently. Additionally, a user can switch hands, so the role assignment might not match the hand the user is using to hold the input device. For example, a user must set up the Daydream controller as right or left-handed, but can choose to hold the controller in the opposite hand.

GetDevicesWithRole provides a list of devices with a specific InputDeviceRole. For example, you can use InputDeviceRole.GameController to retrieve all connected GameController devices:

var gameControllers = new List<UnityEngine.XR.InputDevice>();
UnityEngine.XR.InputDevices.GetDevicesWithRole(UnityEngine.XR.InputDeviceRole.GameController, gameControllers);

foreach (var device in gameControllers)
{
    Debug.Log(string.Format("Device name '{0}' has role '{1}'", device.name, device.role.ToString()));
}

Accessing input devices by XR node

XR nodes represent the physical points of reference in the XR system. The user’s head position, their right and left hands, and a tracking reference such as an Oculus camera are all XR nodes. The XRNode enumeration defines the available nodes. The defined nodes are:

  • CenterEye: a point midway between the pupils of the user’s eyes.

  • GameController: a console-style game controller. Multiple game controller nodes can exist.

  • HardwareTracker: a hardware tracking device, typically attached to the user or a physical item. Multiple hardware tracker nodes can exist.

  • Head: the center point of the user’s head, as calculated by the XR system.

  • LeftEye: the user’s left eye.

  • LeftHand: the user’s left hand.

  • RightEye: the user’s right eye.

  • RightHand: the user’s right hand.

  • TrackingReference: a tracking reference point, such as the Oculus camera. Multiple tracking reference nodes can exist.

Use InputDevices.GetDevicesAtXRNode to get a list of devices associated with a specific XRNode. The following example shows how to get the left-handed controller:

var leftHandDevices = new List<UnityEngine.XR.InputDevice>();
UnityEngine.XR.InputDevices.GetDevicesAtXRNode(UnityEngine.XR.XRNode.LeftHand, leftHandDevices);

if(leftHandDevices.Count == 1)
{
    UnityEngine.XR.InputDevice device = leftHandDevices[0];
    Debug.Log(string.Format("Device name '{0}' with role '{1}'", device.name, device.role.ToString()));
}
else if(leftHandDevices.Count > 1)
{
    Debug.Log("Found more than one left hand!");
}

Accessing input features on an input device

You can read an input feature, such as the state of a trigger button, from a specific InputDevice. To read the state of the right trigger, first get an instance of the right-handed device using InputDeviceRole.RightHanded or XRNode.RightHand. Once you have the correct device, use the InputDevice.TryGetFeatureValue function to access the current state.

TryGetFeatureValue() attempts to read the current value of a feature, but can fail if the device does not support the specified feature or if the device is invalid (for example, the controller is no longer active). The function returns true if it successfully retrieves the specified feature value, and false if it fails.
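
As a minimal sketch of that first step, the following code (using XRNode.RightHand; InputDeviceRole.RightHanded works the same way) acquires a right-handed device for the feature queries that follow:

var rightHandDevices = new List<UnityEngine.XR.InputDevice>();
UnityEngine.XR.InputDevices.GetDevicesAtXRNode(UnityEngine.XR.XRNode.RightHand, rightHandDevices);

UnityEngine.XR.InputDevice device = default(UnityEngine.XR.InputDevice);
if (rightHandDevices.Count > 0)
{
    device = rightHandDevices[0]; // this device is used as "device" in the examples below
}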

To get a particular button, touch input, or joystick axis value, use the CommonUsages class. CommonUsages includes each InputFeatureUsage in the XR input mapping table, as well as tracking features like position and rotation. The following code example uses CommonUsages.triggerButton to detect whether the player is currently pulling the trigger button on a particular InputDevice instance:

bool triggerValue;
if (device.TryGetFeatureValue(UnityEngine.XR.CommonUsages.triggerButton, out triggerValue) && triggerValue)
{
    Debug.Log("Trigger button is pressed");
}

You can also use the InputDevice.TryGetFeatureUsages function to get a list of every InputFeatureUsage that a device provides. This function returns a list of InputFeatureUsage items, which have a type and name property describing the feature. The following example enumerates all of the Boolean features provided by a given input device:

var inputFeatures = new List<UnityEngine.XR.InputFeatureUsage>();
if (device.TryGetFeatureUsages(inputFeatures))
{
    foreach (var feature in inputFeatures)
    {
        if (feature.type == typeof(bool))
        {
            bool featureValue;
            if (device.TryGetFeatureValue(feature.As<bool>(), out featureValue))
            {
                Debug.Log(string.Format("Bool feature '{0}''s value is '{1}'", feature.name, featureValue.ToString()));
            }
        }
    }
}

primaryButton example

Different controller configurations provide access to different features: multiple controllers on one system, different controllers on different systems, or different buttons on the same controllers under different SDKs. This diversity makes it more complicated to support input from a range of XR systems. The Unity InputFeatureUsage API helps you get input in an XR platform-agnostic way.

The following example accesses the InputFeatureUsage called primaryButton, no matter which controller or input device provides it. The example includes a class that scans the available devices for the primaryButton. The class monitors the value of the feature on any connected device and if the value changes, the class dispatches a UnityEvent.

To use this class, add it as a component to any GameObject in the Scene.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Events;
using UnityEngine.XR;

[System.Serializable]
public class PrimaryButtonEvent : UnityEvent<bool>{}

public class PrimaryButtonWatcher : MonoBehaviour
{
    public PrimaryButtonEvent primaryButtonPress;
    
    private bool lastButtonState = false;
    private List<UnityEngine.XR.InputDevice> allDevices;
    private List<UnityEngine.XR.InputDevice> devicesWithPrimaryButton;
    
    void Start()
    {
        if(primaryButtonPress == null)
        {
            primaryButtonPress = new PrimaryButtonEvent();
        }
        allDevices = new List<UnityEngine.XR.InputDevice>();
        devicesWithPrimaryButton = new List<UnityEngine.XR.InputDevice>();
        InputTracking.nodeAdded += InputTracking_nodeAdded;
    }

    // check for new input devices when new XRNode is added
    private void InputTracking_nodeAdded(XRNodeState obj)
    {
        updateInputDevices();
    }

    void Update()
    {
        bool tempState = false;
        bool invalidDeviceFound = false;
        foreach(var device in devicesWithPrimaryButton)
        {
            bool primaryButtonState = false;
            tempState = device.isValid // the device is still valid
                        && device.TryGetFeatureValue(CommonUsages.primaryButton, out primaryButtonState) // did get a value
                        && primaryButtonState // the value we got
                        || tempState; // cumulative result from other controllers
            if (!device.isValid)
                invalidDeviceFound = true;
        }

        if (tempState != lastButtonState) // Button state changed since last frame
        {
            primaryButtonPress.Invoke(tempState);
            lastButtonState = tempState;
        }

        if (invalidDeviceFound || devicesWithPrimaryButton.Count == 0) // refresh device lists
            updateInputDevices();
    }

    // find any devices supporting the desired feature usage
    void updateInputDevices()
    {
        devicesWithPrimaryButton.Clear();
        UnityEngine.XR.InputDevices.GetDevices(allDevices);
        bool discardedValue;
        foreach (var device in allDevices)
        {
            if(device.TryGetFeatureValue(CommonUsages.primaryButton, out discardedValue))
            {
                devicesWithPrimaryButton.Add(device); // Add any devices that have a primary button.
            }
        }
    }
}

The following PrimaryReactor class uses the PrimaryButtonWatcher to detect when you press a primary button and, in response to a press, rotates the GameObject it is attached to. To use this class, add it to a visible GameObject, such as a Cube, and drag the PrimaryButtonWatcher reference to the watcher property.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class PrimaryReactor : MonoBehaviour
{
    public PrimaryButtonWatcher watcher;
    public bool IsPressed = false; // used to display button state in the Unity Inspector window
    public Vector3 rotationAngle = new Vector3(45, 45, 45);
    public float rotationDuration = 0.25f; // seconds
    private Quaternion offRotation;
    private Quaternion onRotation;
    private Coroutine rotator;

    void Start()
    {
        watcher.primaryButtonPress.AddListener(onPrimaryButtonEvent);
        offRotation = this.transform.rotation;
        onRotation = Quaternion.Euler(rotationAngle) * offRotation;
    }

    public void onPrimaryButtonEvent(bool pressed)
    {
        IsPressed = pressed;
        if (rotator != null)
            StopCoroutine(rotator);
        if (pressed)
            rotator = StartCoroutine(AnimateRotation(this.transform.rotation, onRotation));
        else
            rotator = StartCoroutine(AnimateRotation(this.transform.rotation, offRotation));
    }

    private IEnumerator AnimateRotation(Quaternion fromRotation, Quaternion toRotation)
    {
        float t = 0;
        while(t < rotationDuration)
        {
            transform.rotation = Quaternion.Lerp(fromRotation, toRotation, t/rotationDuration);
            t += Time.deltaTime;
            yield return null;
        }
    }
}

XR input through the legacy input system

You can poll XR input features through the legacy input system by using the appropriate legacy input indices from the XR input mappings table. Create an axis mapping in Edit > Project Settings > Input to map an input name to the axis or button index of the platform device's feature. To retrieve the button or axis value, call Input.GetAxis or Input.GetButton with the mapped axis or button name.
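
As an illustrative sketch (the axis name "XR_RightTrigger" is hypothetical; it must match whatever name you entered in the Input settings, mapped to the right-hand trigger axis, index 10 in the table above):

// "XR_RightTrigger" is a hypothetical axis name defined in Edit > Project Settings > Input.
float rightTrigger = Input.GetAxis("XR_RightTrigger");
if (rightTrigger > 0.5f)
{
    Debug.Log("Right trigger is more than half pressed: " + rightTrigger);
}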

For more information about how to use the button and joystick axes, see the documentation on Conventional game input.

Haptics

You can send haptic events to an InputDevice. Unity supports haptics as either a simple impulse with amplitude and duration, or as a buffer of data.

Not every platform supports every type of haptic feedback, but you can query a device for its haptic capabilities. The following example gets the right-handed input devices, checks whether each device is capable of haptics, and, if it is, plays back an impulse:

List<UnityEngine.XR.InputDevice> devices = new List<UnityEngine.XR.InputDevice>();
UnityEngine.XR.InputDevices.GetDevicesWithRole(UnityEngine.XR.InputDeviceRole.RightHanded, devices);

foreach (var device in devices)
{
    UnityEngine.XR.HapticCapabilities capabilities;
    if (device.TryGetHapticCapabilities(out capabilities))
    {
        if (capabilities.supportsImpulse)
        {
            uint channel = 0;
            float amplitude = 0.5f;
            float duration = 1.0f;
            device.SendHapticImpulse(channel, amplitude, duration);
        }
    }
}
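
Where a device supports buffered haptics, you can send raw data instead of an impulse. The following sketch (again assuming a device retrieved as above; the buffer length and contents are placeholder values) checks HapticCapabilities.supportsBuffer and sends a byte buffer with InputDevice.SendHapticBuffer:

UnityEngine.XR.HapticCapabilities capabilities;
if (device.TryGetHapticCapabilities(out capabilities) && capabilities.supportsBuffer)
{
    uint channel = 0;
    byte[] buffer = new byte[256]; // placeholder haptic clip
    for (int i = 0; i < buffer.Length; i++)
    {
        buffer[i] = 128; // constant mid-level rumble, purely illustrative
    }
    device.SendHapticBuffer(channel, buffer);
}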
