Legacy Documentation: Version 2018.1

MonoBehaviour.OnAudioFilterRead(float[], int)


Parameters

data: An array of floats comprising the audio data.
channels: An int that stores the number of channels of audio data passed to this delegate.

Description

If OnAudioFilterRead is implemented, Unity will insert a custom filter into the audio DSP chain.

The filter is inserted into the DSP chain in the same order in which the MonoBehaviour script appears in the Inspector.

OnAudioFilterRead is called every time a chunk of audio is sent to the filter (this happens frequently, roughly every 20 ms depending on the sample rate and platform). The audio data is an array of floats in the range [-1.0f, 1.0f] and contains the audio from the previous filter in the chain or from the AudioClip on the AudioSource. If this is the first filter in the chain and no clip is attached to the audio source, the filter's output is played as the audio source. In this way you can use the filter in place of an audio clip, procedurally generating audio.
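
As a minimal sketch of this procedural use, and separate from the metronome example below, the following component fills the buffer with a continuous sine tone when attached to an AudioSource that has no clip. The SineGenerator class name and its frequency and gain fields are assumptions made for this illustration, not part of the Unity API.

using UnityEngine;

// Minimal sketch: generate a sine tone procedurally in OnAudioFilterRead.
// The frequency and gain defaults are illustrative assumptions.
[RequireComponent(typeof(AudioSource))]
public class SineGenerator : MonoBehaviour
{
    public float frequency = 440.0f; // tone frequency in Hz
    public float gain = 0.1f;        // output level

    private double phase;
    private double sampleRate;

    void Start()
    {
        sampleRate = AudioSettings.outputSampleRate;
    }

    void OnAudioFilterRead(float[] data, int channels)
    {
        if (sampleRate <= 0.0)
            return; // not initialised yet

        double increment = frequency * 2.0 * Mathf.PI / sampleRate;
        int frames = data.Length / channels;
        for (int n = 0; n < frames; n++)
        {
            float sample = gain * (float)System.Math.Sin(phase);
            phase += increment;
            if (phase > 2.0 * Mathf.PI)
                phase -= 2.0 * Mathf.PI;

            // Write the same sample to every channel of this frame.
            for (int i = 0; i < channels; i++)
                data[n * channels + i] += sample;
        }
    }
}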

If there is more than one channel, the channel data is interleaved. This means each consecutive data sample in the array comes from a different channel until you run out of channels and loop back to the first. data.Length reports the total size of the data, so to find the number of samples per channel, divide data.Length by channels.
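
To illustrate the interleaved layout, the sketch below walks the buffer one frame at a time and tracks the peak level of each channel separately. The ChannelPeakMeter class and its peaks field are assumptions made for this example, not part of the Unity API.

using UnityEngine;

// Sketch: walking an interleaved buffer frame by frame, per channel.
// The peaks array is an illustrative assumption.
public class ChannelPeakMeter : MonoBehaviour
{
    private float[] peaks;

    void OnAudioFilterRead(float[] data, int channels)
    {
        if (peaks == null || peaks.Length != channels)
            peaks = new float[channels];

        int samplesPerChannel = data.Length / channels;
        for (int n = 0; n < samplesPerChannel; n++)
        {
            for (int i = 0; i < channels; i++)
            {
                // Sample n of channel i is stored at index n * channels + i.
                float s = Mathf.Abs(data[n * channels + i]);
                if (s > peaks[i])
                    peaks[i] = s;
            }
        }
    }
}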

If OnAudioFilterRead is implemented, a VU meter showing the outgoing sample level is displayed in the Inspector. The filter's processing time is also measured, and the elapsed milliseconds are shown next to the VU meter. The number turns red if the filter takes too long, meaning the mixer will be starved of audio data.

Also note that OnAudioFilterRead is called on a different thread from the main thread (namely the audio thread), so calling many Unity functions from this callback is not allowed (if you try, a warning appears at run time).
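
A common pattern is therefore to do only plain buffer math inside the callback and hand the result to the main thread through a simple field. Below is a hedged sketch of that pattern; the AudioLevelProbe class and its peakLevel field are illustrative assumptions, not part of the Unity API.

using UnityEngine;

// Sketch: keep Unity API calls out of OnAudioFilterRead (it runs on the audio thread).
// The callback only writes to a plain field; Update on the main thread reads it.
public class AudioLevelProbe : MonoBehaviour
{
    private volatile float peakLevel; // illustrative field, written on the audio thread

    void OnAudioFilterRead(float[] data, int channels)
    {
        float peak = 0f;
        for (int i = 0; i < data.Length; i++)
        {
            float s = data[i] < 0f ? -data[i] : data[i];
            if (s > peak)
                peak = s;
        }
        peakLevel = peak; // no Unity API calls on the audio thread
    }

    void Update()
    {
        // Main thread: safe to call Unity API here, e.g. to drive a UI meter.
        Debug.Log("Current peak level: " + peakLevel);
    }
}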

See Also: Audio Filters.

#pragma strict
// The code example shows how to implement a metronome that procedurally
// generates the click sounds via the OnAudioFilterRead callback.
// While the game is paused or suspended, this time will not be updated and sounds
// playing will be paused. Therefore developers of music scheduling routines do not have
// to do any rescheduling after the app is unpaused
@RequireComponent(AudioSource)
public class AudioTest extends MonoBehaviour {
	public var bpm: double = 140.0F;
	public var gain: float = 0.5F;
	public var signatureHi: int = 4;
	public var signatureLo: int = 4;
	private var nextTick: double = 0.0F;
	private var amp: float = 0.0F;
	private var phase: float = 0.0F;
	private var sampleRate: double = 0.0F;
	private var accent: int;
	private var running: boolean = false;
	function Start() {
		accent = signatureHi;
		var startTick: double = AudioSettings.dspTime;
		sampleRate = AudioSettings.outputSampleRate;
		nextTick = startTick * sampleRate;
		running = true;
	}
	function OnAudioFilterRead(data: float[], channels: int) {
		if (!running) return;
		var samplesPerTick: double = sampleRate * 60.0F / bpm * 4.0F / signatureLo;
		var sample: double = AudioSettings.dspTime * sampleRate;
		var dataLen: int = data.Length / channels;
		var n: int = 0;
		while ( n < dataLen ) {
			var x: float = gain * amp * Mathf.Sin(phase);
			var i: int = 0;
			while ( i < channels ) {
				data[n * channels + i] += x;
				i++;
			}
			while ( sample + n >= nextTick ) {
				nextTick += samplesPerTick;
				amp = 1.0F;
				if (++accent > signatureHi) {
					accent = 1;
					amp *= 2.0F;
				}
				Debug.Log("Tick: " + accent + "/" + signatureHi);
			}
			phase += amp * 0.3F;
			amp *= 0.993F;
			n++;
		}
	}
}
using UnityEngine;

// The code example shows how to implement a metronome that procedurally
// generates the click sounds via the OnAudioFilterRead callback.
// While the game is paused or suspended, this time will not be updated and sounds
// playing will be paused. Therefore developers of music scheduling routines do not have
// to do any rescheduling after the app is unpaused.
[RequireComponent(typeof(AudioSource))]
public class AudioTest : MonoBehaviour
{
    public double bpm = 140.0F;
    public float gain = 0.5F;
    public int signatureHi = 4;
    public int signatureLo = 4;

    private double nextTick = 0.0F;
    private float amp = 0.0F;
    private float phase = 0.0F;
    private double sampleRate = 0.0F;
    private int accent;
    private bool running = false;

    void Start()
    {
        accent = signatureHi;
        double startTick = AudioSettings.dspTime;
        sampleRate = AudioSettings.outputSampleRate;
        nextTick = startTick * sampleRate;
        running = true;
    }

    void OnAudioFilterRead(float[] data, int channels)
    {
        if (!running)
            return;

        double samplesPerTick = sampleRate * 60.0F / bpm * 4.0F / signatureLo;
        double sample = AudioSettings.dspTime * sampleRate;
        int dataLen = data.Length / channels;

        int n = 0;
        while (n < dataLen)
        {
            float x = gain * amp * Mathf.Sin(phase);
            int i = 0;
            while (i < channels)
            {
                data[n * channels + i] += x;
                i++;
            }
            while (sample + n >= nextTick)
            {
                nextTick += samplesPerTick;
                amp = 1.0F;
                if (++accent > signatureHi)
                {
                    accent = 1;
                    amp *= 2.0F;
                }
                Debug.Log("Tick: " + accent + "/" + signatureHi);
            }
            phase += amp * 0.3F;
            amp *= 0.993F;
            n++;
        }
    }
}