Objects
Web Audio objects that provide methods, events, and properties for high-performance audio. These APIs conform to the emerging W3C Web Audio standard.
In this section
| Topic | Description |
|---|---|
| AnalyserNode | Provides real-time frequency and time-domain analysis information for music visualization and analysis. |
| AudioBuffer | A memory-resident audio asset used for one-shot sounds or short audio clips. |
| AudioBufferSourceNode | An audio source that plays back in-memory audio data held in an AudioBuffer; used for short segments that require highly accurate scheduling. |
| AudioContext | Provides the basis for using Web Audio: the objects, events, methods, and properties used to create, modify, and measure audio content. |
| AudioDestinationNode | Represents the final destination for audio output, typically the audio hardware. It is exposed as the destination attribute of the AudioContext. |
| AudioListener | Represents the position and orientation of the person listening to the audio, set with the setPosition and setOrientation methods. |
| AudioNode | The basic building block of an AudioContext, representing audio sources, the audio destination, and intermediate processing modules. |
| AudioParam | Controls an individual parameter of an AudioNode, such as volume level or the timing of fades and sweeps. |
| AudioProcessingEvent | An Event object dispatched to a ScriptProcessorNode when an audioprocess event fires. |
| BiquadFilterNode | An AudioNode processor that implements common low-order filters, used for effects such as tone controls, graphic equalizers, and phasers. |
| ChannelMergerNode | An AudioNode used to combine or merge individual channels into a single audio stream. |
| ChannelSplitterNode | An AudioNode used to access the individual channels of an audio stream. |
| ConvolverNode | Used to create real-time linear effects, such as the reverb of a concert hall or arena, from sampled impulse responses. |
| DecodeErrorCallback | An asynchronous callback function that signals an error while decoding audio data with an AudioContext. |
| DecodeSuccessCallback | An asynchronous callback function that signals that audio data was successfully decoded with an AudioContext. |
| DelayNode | Represents a variable delay line. |
| DynamicsCompressorNode | Applies dynamics compression, lowering the volume of loud passages and raising the volume of soft passages to produce a louder, fuller sound. It also helps avoid clipping of the audio signal. |
| GainNode | Used to control the volume of an audio signal. |
| MediaStreamAudioDestinationNode | An audio destination that represents a MediaStream with a single AudioMediaStreamTrack. |
| MediaStreamAudioSourceNode | An AudioNode source created from a MediaStream, such as a live audio feed or a laptop microphone. |
| OfflineAudioCompletionEvent | The Event object passed to an OfflineAudioContext's oncomplete event handler when rendering finishes. |
| OfflineAudioContext | A type of AudioContext that renders not to the audio hardware but to an AudioBuffer, for faster-than-real-time rendering or mix-down. |
| OscillatorNode | An audio source that generates a periodic waveform. |
| PannerNode | An AudioNode used to spatialize and position audio in 3D space. |
| PeriodicWave | Represents an arbitrary waveform for use with an OscillatorNode. It is created with createPeriodicWave and assigned with setPeriodicWave. |
| ScriptProcessorNode | An AudioNode that processes, analyzes, or generates audio using JavaScript. |
| StereoPannerNode | An AudioNode that positions an incoming audio stream in a stereo image using a low-cost equal-power panning algorithm. |
| WaveShaperNode | Used to apply non-linear waveshaping effects to an audio stream, to create distortion and other effects. |
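The objects above are typically used together: an AudioContext creates source and processing nodes, which are connected into a graph that ends at the context's destination. A minimal sketch, assuming a browser that implements the unprefixed Web Audio API (the `dbToGain` helper and the feature check are illustrative, not part of the API):

```javascript
// Convert a decibel value to the linear gain expected by GainNode.gain.
const dbToGain = (db) => Math.pow(10, db / 20);

// Feature check so the sketch is a no-op where Web Audio is unavailable.
if (typeof AudioContext !== "undefined") {
  const context = new AudioContext();

  // OscillatorNode: an audio source that generates a periodic waveform.
  const oscillator = context.createOscillator();
  oscillator.type = "sine";
  oscillator.frequency.value = 440; // A4, in hertz

  // GainNode: controls the volume of the signal passing through it.
  const gain = context.createGain();
  gain.gain.value = dbToGain(-6); // attenuate by 6 dB

  // Build the graph: source -> gain -> hardware destination.
  oscillator.connect(gain);
  gain.connect(context.destination);

  // Start now and stop one second later, using the context clock.
  oscillator.start();
  oscillator.stop(context.currentTime + 1);
}
```

The same graph-building pattern applies to the other nodes in the table: create them from the AudioContext, set their AudioParam values, and connect them between the source and the destination.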