Objects

Web Audio objects that provide methods, events, and properties for high-performance audio. These APIs conform to the emerging W3C Web Audio standard.

In this section

AnalyserNode

Provides real-time frequency and time-domain analysis information for music visualization and analysis.
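As a minimal browser sketch (AudioContext is assumed to be a global, and the canvas rendering step is left as a placeholder), an analyser can be polled once per animation frame for the current spectrum:

```javascript
// Connect a source through an AnalyserNode, then sample the
// frequency-domain data on every animation frame.
const audioCtx = new AudioContext();
const analyser = audioCtx.createAnalyser();
analyser.fftSize = 2048;

const source = audioCtx.createOscillator();
source.connect(analyser);
analyser.connect(audioCtx.destination);
source.start();

const data = new Uint8Array(analyser.frequencyBinCount);
function draw() {
  analyser.getByteFrequencyData(data); // fill `data` with the current spectrum
  // ...render `data` to a canvas here...
  requestAnimationFrame(draw);
}
draw();
```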

AudioBuffer

A memory-resident audio asset used for one-shot sounds or short audio clips.

AudioBufferSourceNode

An audio source that plays back an in-memory audio asset stored in an AudioBuffer; used for short segments that require highly accurate scheduling.
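As a browser sketch (the clip URL is a placeholder), a short asset can be loaded, decoded with the success and error callbacks described below, and scheduled for playback:

```javascript
// Load a short clip, decode it on the AudioContext, and schedule playback.
// "clip.mp3" is a placeholder URL.
const audioCtx = new AudioContext();
const request = new XMLHttpRequest();
request.open("GET", "clip.mp3", true);
request.responseType = "arraybuffer";
request.onload = () => {
  audioCtx.decodeAudioData(
    request.response,
    buffer => {                                    // DecodeSuccessCallback
      const source = audioCtx.createBufferSource();
      source.buffer = buffer;
      source.connect(audioCtx.destination);
      source.start(audioCtx.currentTime + 0.5);    // half a second from now
    },
    error => console.log("decode failed", error)   // DecodeErrorCallback
  );
};
request.send();
```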

AudioContext

Provides the basis for using Web Audio, and provides objects, events, methods, and properties to create, modify, and measure audio content.
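A minimal sketch, assuming a browser environment where AudioContext is available: create a context, then wire a source through an intermediate node to the speakers.

```javascript
// Create a context and build a simple graph: source -> gain -> speakers.
const audioCtx = new AudioContext();

const oscillator = audioCtx.createOscillator(); // a simple audio source
const gainNode = audioCtx.createGain();

oscillator.connect(gainNode);           // source -> gain
gainNode.connect(audioCtx.destination); // gain -> audio hardware

oscillator.start();
```

Every node in the graph is created from the context, and audio flows from sources through processing nodes to the context's destination.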

AudioDestinationNode

Represents the final destination for audio output. Typically this is the audio hardware; it is accessed through the AudioContext destination attribute.

AudioListener

Represents the position and orientation of the person listening to the audio, set with the setPosition and setOrientation methods.

AudioNode

The basic building block of AudioContext representing audio sources, audio destination, and intermediate processing modules.

AudioParam

Controls an individual parameter on an AudioNode, such as timing for fades or sweeps, or volume levels.
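For example (a browser sketch), a two-second fade-out can be scheduled by automating the gain AudioParam of a GainNode:

```javascript
// Schedule a linear fade-out on a GainNode's gain parameter (an AudioParam).
const audioCtx = new AudioContext();
const gainNode = audioCtx.createGain();
gainNode.connect(audioCtx.destination);

const now = audioCtx.currentTime;
gainNode.gain.setValueAtTime(1, now);              // start at full volume
gainNode.gain.linearRampToValueAtTime(0, now + 2); // ramp to silence over 2 s
```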

AudioProcessingEvent

An Event object type that's dispatched to ScriptProcessorNode nodes when an audioprocess event fires.

BiquadFilterNode

An AudioNode processor that implements common low order filters used for effects such as tone controls, graphic equalizers, and phasers.
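A low-pass filter as a simple tone control, sketched for a browser environment:

```javascript
// Filter a harmonically rich source so the effect is audible.
const audioCtx = new AudioContext();
const source = audioCtx.createOscillator();
source.type = "sawtooth";

const filter = audioCtx.createBiquadFilter();
filter.type = "lowpass";
filter.frequency.value = 800; // attenuate harmonics above roughly 800 Hz
filter.Q.value = 1;

source.connect(filter);
filter.connect(audioCtx.destination);
source.start();
```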

ChannelMergerNode

An AudioNode used to merge individual audio channels into a single audio stream.

ChannelSplitterNode

An AudioNode used to access individual channels of an audio stream.

ConvolverNode

Used to create real-time linear effects such as the reverb of a concert hall or arena using sampled impulses.

DecodeErrorCallback

Asynchronous callback function to signal an error with decoding audio data on an AudioContext.

DecodeSuccessCallback

Asynchronous callback function to signal successfully decoding audio data with an AudioContext.

DelayNode

Represents a variable delay line node.

DynamicsCompressorNode

Represents dynamics compression, which lowers the volume of loud passages and raises the volume of quiet passages, producing a louder, fuller sound. It also helps avoid clipping of the audio signal.

GainNode

Used to control the volume of an audio signal.

MediaElementAudioSourceNode

Represents an audio source from an audio or video element.
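For instance (a browser sketch; the selector is a placeholder for an audio element on the page), an existing media element can be routed through the Web Audio graph:

```javascript
// Route an existing <audio> element into the Web Audio graph.
const audioCtx = new AudioContext();
const mediaElement = document.querySelector("audio"); // placeholder element
const source = audioCtx.createMediaElementSource(mediaElement);
source.connect(audioCtx.destination);
mediaElement.play();
```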

MediaStreamAudioDestinationNode

An audio destination that represents a MediaStream with a single audio MediaStreamTrack.

MediaStreamAudioSourceNode

An AudioNode source created from a MediaStream, such as a live audio feed or a laptop microphone.

OfflineAudioCompletionEvent

Event object passed to an OfflineAudioContext when an oncomplete event fires.

OfflineAudioContext

A type of AudioContext that does not render to the audio hardware, but to an AudioBuffer for faster rendering or mix-down.
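A browser sketch: render one second of a tone to an AudioBuffer faster than real time, receiving the result through the oncomplete event described above.

```javascript
// Render offline: 2 channels, 44100 frames (1 s) at 44.1 kHz.
const offlineCtx = new OfflineAudioContext(2, 44100, 44100);
const osc = offlineCtx.createOscillator();
osc.connect(offlineCtx.destination);
osc.start();

offlineCtx.oncomplete = event => {
  // event is an OfflineAudioCompletionEvent; renderedBuffer holds the mix-down
  const renderedBuffer = event.renderedBuffer;
};
offlineCtx.startRendering();
```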

OscillatorNode

An audio source that generates a periodic waveform.
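As a minimal browser sketch, an oscillator can play a one-second 440 Hz sine tone:

```javascript
// Play a 440 Hz sine tone for one second.
const audioCtx = new AudioContext();
const osc = audioCtx.createOscillator();
osc.type = "sine";
osc.frequency.value = 440; // concert A
osc.connect(audioCtx.destination);
osc.start();
osc.stop(audioCtx.currentTime + 1); // stop one second from now
```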

PannerNode

An AudioNode used to spatialize and position audio in 3D space.

PeriodicWave

Used to define an arbitrary periodic waveform for an OscillatorNode. A PeriodicWave is created with createPeriodicWave and applied to an oscillator with setPeriodicWave.

ScriptProcessorNode

An AudioNode that processes, analyzes, or generates audio using JavaScript.
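A browser sketch: generate white noise by filling each output buffer from script. (Note that the current Web Audio specification deprecates ScriptProcessorNode in favor of AudioWorklet.)

```javascript
// Fill each processing block with random samples in [-1, 1].
const audioCtx = new AudioContext();
const processor = audioCtx.createScriptProcessor(4096, 1, 1); // input is ignored

processor.onaudioprocess = event => {
  const output = event.outputBuffer.getChannelData(0);
  for (let i = 0; i < output.length; i++) {
    output[i] = Math.random() * 2 - 1;
  }
};
processor.connect(audioCtx.destination);
```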

StereoPannerNode

An AudioNode that positions an incoming audio stream in a stereo image using a low-cost equal-power panning algorithm.
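For example (a browser sketch), a source can be panned fully to the left channel:

```javascript
// Pan an oscillator hard left; pan ranges from -1 (left) to 1 (right).
const audioCtx = new AudioContext();
const osc = audioCtx.createOscillator();
const panner = audioCtx.createStereoPanner();
panner.pan.value = -1;
osc.connect(panner);
panner.connect(audioCtx.destination);
osc.start();
```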

WaveShaperNode

Used to apply a non-linear waveshaping effect to an audio signal to create distortion and other effects.
