Version: 2020.2

Texture arrays

A texture array is a collection of 2D textures of the same size, format, and flags that looks like a single object to the GPU and can be sampled in a shader with a texture element index. Texture arrays are useful for implementing custom terrain rendering systems or other special effects where you need an efficient way of accessing many textures of the same size and format. The elements of a 2D texture array are also known as slices, or layers.

Platform support

Texture arrays need to be supported by the underlying graphics API and the GPU. They are available on:

  • Direct3D 11/12 (Windows, Xbox One)
  • OpenGL Core (Mac OS X, Linux)
  • Metal (iOS, Mac OS X)
  • OpenGL ES 3.0 (Android, WebGL 2.0)
  • PlayStation 4

Other platforms, such as those using OpenGL ES 2.0 or WebGL 1.0, do not support texture arrays. Use SystemInfo.supports2DArrayTextures to determine texture array support at runtime.
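
For example, a minimal runtime check might look like this (a sketch; the class name is illustrative):

    using UnityEngine;

    public class TextureArraySupportCheck : MonoBehaviour
    {
        void Start()
        {
            // Query support before allocating any Texture2DArray objects.
            if (!SystemInfo.supports2DArrayTextures)
                Debug.LogWarning("Texture arrays are not supported; use a fallback rendering path.");
        }
    }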

Importing texture arrays

You can import texture arrays from source texture files that are divided into cells. These are called flipbook textures. To do this:

  1. Import the source texture into your Unity Project.
  2. In your Project view, select the resulting Texture Asset. Unity displays the Texture import settings in the Inspector.
  3. In the Inspector, set Texture Shape to 2D Array. Unity displays the Columns and Rows properties.
  4. Set Columns and Rows to the appropriate values for your flipbook texture.
  5. Click Apply.

For more information, see Texture import settings.

Creating and manipulating texture arrays using scripts

To create a texture array from a C# script, use the Texture2DArray class to initialize the texture and set pixel data, and save the object as an asset file using AssetDatabase.CreateAsset.
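
The following sketch shows one way to do this from an Editor script. The menu item name, texture size, slice count, asset path, and fill colors are illustrative choices, not part of the Unity API:

    using UnityEngine;
    using UnityEditor;

    public static class TextureArrayCreator
    {
        [MenuItem("Assets/Create Example Texture2DArray")]
        static void CreateTextureArrayAsset()
        {
            const int size = 64;   // width and height of each slice
            const int slices = 4;  // number of array elements

            var textureArray = new Texture2DArray(size, size, slices, TextureFormat.RGBA32, false);

            var colors = new Color[size * size];
            for (int slice = 0; slice < slices; slice++)
            {
                // Fill each slice with a solid color so the slices are distinguishable.
                Color fill = Color.Lerp(Color.red, Color.blue, slice / (float)(slices - 1));
                for (int i = 0; i < colors.Length; i++)
                    colors[i] = fill;
                textureArray.SetPixels(colors, slice);
            }
            textureArray.Apply(); // upload the pixel data to the GPU

            AssetDatabase.CreateAsset(textureArray, "Assets/ExampleTextureArray.asset");
            AssetDatabase.SaveAssets();
        }
    }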

Normally, texture arrays are used purely within GPU memory, but you can use Graphics.CopyTexture, Texture2DArray.GetPixels and Texture2DArray.SetPixels to transfer pixels to and from system memory.
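
For example (a sketch; the field names and slice indices are illustrative, and CPU-side access requires a readable texture in an uncompressed format):

    using UnityEngine;

    public class TextureArrayPixelAccess : MonoBehaviour
    {
        public Texture2D sourceTexture;      // must match the array's size and format
        public Texture2DArray textureArray;

        void Start()
        {
            // GPU-side copy: copies sourceTexture into array element 0, mip 0,
            // without a round trip through system memory.
            Graphics.CopyTexture(sourceTexture, 0, 0, textureArray, 0, 0);

            // CPU-side access: GetPixels and SetPixels operate on the system
            // memory copy of a slice; Apply uploads the changes to the GPU.
            Color[] pixels = textureArray.GetPixels(1, 0);
            textureArray.SetPixels(pixels, 2, 0);
            textureArray.Apply();
        }
    }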

Using texture arrays as render targets

Texture array elements can also be used as render targets. Use RenderTexture.dimension to specify in advance whether the render target is to be a 2D texture array. The depthSlice argument to Graphics.SetRenderTarget indicates which array element to render into. On platforms that support layered rendering (that is, geometry shaders), you can set the depthSlice argument to –1 to set the whole texture array as a render target. You can also use a geometry shader to render into individual elements.
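
A minimal sketch of both approaches (the texture size, slice count, and clear colors are illustrative):

    using UnityEngine;
    using UnityEngine.Rendering;

    public class TextureArrayRenderTarget : MonoBehaviour
    {
        RenderTexture arrayTarget;

        void Start()
        {
            // Declare the render target as a 2D texture array before creating it.
            arrayTarget = new RenderTexture(256, 256, 16);
            arrayTarget.dimension = TextureDimension.Tex2DArray;
            arrayTarget.volumeDepth = 4; // number of array elements
            arrayTarget.Create();

            // Render into element 2 at mip level 0.
            Graphics.SetRenderTarget(arrayTarget, 0, CubemapFace.Unknown, 2);
            GL.Clear(true, true, Color.green);

            // On platforms with layered rendering support, depthSlice -1 binds
            // the whole array; a geometry shader then selects the output slice.
            Graphics.SetRenderTarget(arrayTarget, 0, CubemapFace.Unknown, -1);
            GL.Clear(true, true, Color.black);

            RenderTexture.active = null; // restore the default render target
        }
    }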

Using texture arrays in shaders

See Using texture arrays in shaders.
