Enable WebGPU in the Player settings so you can access its modern graphics features.
Unlike WebGL2, Unity doesn’t enable WebGPU automatically. This page explains how to enable the appropriate setting in your project so you can use WebGPU features. For information about the features and limitations of WebGPU, refer to WebGPU.
To enable WebGPU in Unity, add it to the supported web graphics API list in your project:

1. Go to Edit > Project Settings > Player.
2. Select the Web tab and expand Other Settings.
3. In the Rendering section, disable Auto Graphics API.
4. In the Graphics APIs list, select Add (+) and choose WebGPU.
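If you prefer to automate this from an Editor script instead of the UI, a minimal sketch follows. It assumes Unity 6 or later, where `GraphicsDeviceType.WebGPU` exists; the class name and menu path are illustrative, and the script must live in an Editor folder.

```csharp
using System.Linq;
using UnityEditor;
using UnityEngine.Rendering;

// Editor-only sketch: adds WebGPU to the Web platform's graphics API list.
public static class WebGPUSetup
{
    [MenuItem("Tools/Add WebGPU to Web Graphics APIs")] // menu path is illustrative
    public static void AddWebGPU()
    {
        // Use an explicit API list instead of Unity's automatic selection,
        // mirroring the "disable Auto Graphics API" step above.
        PlayerSettings.SetUseDefaultGraphicsAPIs(BuildTarget.WebGL, false);

        var apis = PlayerSettings.GetGraphicsAPIs(BuildTarget.WebGL);
        if (!apis.Contains(GraphicsDeviceType.WebGPU))
        {
            // Append WebGPU while keeping any existing entries (such as WebGL2).
            PlayerSettings.SetGraphicsAPIs(BuildTarget.WebGL,
                apis.Append(GraphicsDeviceType.WebGPU).ToArray());
        }
    }
}
```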
WebGPU is now a supported web graphics API for your project. Next, choose whether to keep WebGL2 as a fallback option.
Unity prioritizes the first API in the Graphics APIs list as the default web graphics API for your project. If a browser doesn’t support that API, or there’s another issue, Unity tries the next API in the list (if one exists) as a fallback. You can therefore keep both APIs in the list, or use WebGPU only.
Keeping both is useful because WebGPU is still experimental and not all browsers support it; with WebGL2 as a fallback, users on older browsers can still run your application.
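One way to confirm which API a given browser actually selected is to log `SystemInfo.graphicsDeviceType` at runtime. A small sketch (the class name is illustrative):

```csharp
using UnityEngine;

// Logs the graphics API the browser ended up with. On the Web platform,
// WebGPU reports GraphicsDeviceType.WebGPU, while WebGL2 reports
// GraphicsDeviceType.OpenGLES3.
public class GraphicsApiLogger : MonoBehaviour
{
    void Start()
    {
        Debug.Log($"Active graphics API: {SystemInfo.graphicsDeviceType}");
    }
}
```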
However, you might want to remove WebGL2, for example to reduce build size and build time (Unity compiles shader variants for each API in the list), or because your application relies on features that WebGL2 doesn’t support, such as compute shaders.
To use WebGPU as the priority web graphics API but keep WebGL2 as a fallback:

1. In the Player settings for the Web platform, open the Graphics APIs list.
2. Make sure the list contains both WebGPU and WebGL2.
3. Drag WebGPU above WebGL2 so that it’s first in the list.
Your project now supports both WebGPU and WebGL2, but WebGPU is the priority.
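In script form, the priority is simply the order of the array passed to `PlayerSettings.SetGraphicsAPIs`. A sketch under the same Unity 6 assumptions as above (Unity represents WebGL2 as `GraphicsDeviceType.OpenGLES3` on the Web platform):

```csharp
using UnityEditor;
using UnityEngine.Rendering;

public static class WebGPUWithFallback
{
    [MenuItem("Tools/Web Graphics - WebGPU with WebGL2 Fallback")] // illustrative path
    public static void Configure()
    {
        PlayerSettings.SetUseDefaultGraphicsAPIs(BuildTarget.WebGL, false);

        // The first entry is tried first; the second is the fallback.
        PlayerSettings.SetGraphicsAPIs(BuildTarget.WebGL, new[]
        {
            GraphicsDeviceType.WebGPU,     // priority
            GraphicsDeviceType.OpenGLES3,  // WebGL2 fallback
        });
    }
}
```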
To remove WebGL2 from the supported web graphics API list:

1. In the Graphics APIs list, select WebGL2.
2. Select Remove (−).
Your project now uses only the WebGPU web graphics API.
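The script equivalent is a single-entry list (again a hedged sketch; the class name and menu path are illustrative):

```csharp
using UnityEditor;
using UnityEngine.Rendering;

public static class WebGPUOnly
{
    [MenuItem("Tools/Web Graphics - WebGPU Only")] // illustrative path
    public static void Configure()
    {
        PlayerSettings.SetUseDefaultGraphicsAPIs(BuildTarget.WebGL, false);

        // WebGPU only: browsers without WebGPU support won't be able
        // to run the built player.
        PlayerSettings.SetGraphicsAPIs(BuildTarget.WebGL,
            new[] { GraphicsDeviceType.WebGPU });
    }
}
```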