5.1 and 7.1 audio
See surround sound.
AC-3
Also called Dolby Digital. AC-3 is a digital audio encoding format that is standard in digital HDTV and DVD creation. It supports up to eight channels of audio.
See also surround sound.
adaptive streaming
See IIS Smooth Streaming.
Advanced Television Systems Committee (ATSC)
The organization that developed and established the standards for digital television broadcast in the United States. These standards require high-definition television to support widescreen, high-resolution images, in both progressive scan and interlaced modes, as well as theater-quality audio playback. The ATSC standard also defines a wider color spectrum than the NTSC or PAL standards.
aspect ratio
See video aspect ratio and pixel aspect ratio.
ATSC
See Advanced Television Systems Committee (ATSC).
AVCHD
A high-definition camcorder format that compresses data using H.264 and AC-3 codecs. AVCHD files use the .mts and .m2ts file extensions.
Audio Video Interleaved (AVI)
A multimedia file format for storing sound and moving pictures.
AVI
See Audio Video Interleaved (AVI).
bandwidth
A computer network's capacity for transferring an amount of data in a given time.
b-frame
In MPEG and WMV encoding, a b-frame is a frame that contains only the differences from the frame preceding it and the frame following it. Also called a bidirectional predictive frame.
See also i-frame and p-frame.
bit rate
The number of bits transferred per second.
bob (deinterlacing method)
A method of deinterlacing that is also known as field interpolation. This method works by taking a set of interlaced video fields and, after analyzing the content in those available fields, creating the additional data necessary to convert the interlaced fields into a single progressive frame.
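The interpolation step can be sketched in a few lines of Python. This is an illustrative neighbour-averaging filter, not the exact algorithm any particular deinterlacer uses:

```python
def bob_deinterlace(field, parity):
    """Expand one interlaced field into a full progressive frame.

    field  : rows (lists of pixel values) holding every other scan line
    parity : 0 if the field holds the even lines, 1 if the odd lines
    """
    height = len(field) * 2
    frame = [None] * height
    # Copy the scan lines the field actually contains.
    for i, row in enumerate(field):
        frame[2 * i + parity] = row
    # Synthesize each missing line from its vertical neighbours.
    for y in range(height):
        if frame[y] is None:
            above = frame[y - 1] if y > 0 else frame[y + 1]
            below = frame[y + 1] if y < height - 1 else frame[y - 1]
            frame[y] = [(a + b) / 2 for a, b in zip(above, below)]
    return frame
```

For example, a two-line field expands to a four-line frame, with each gap averaged from the lines above and below it.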
caption
Text that accompanies images or videos, either as a supplemental description or as a transcript of spoken words.
CBR
See constant bit rate (CBR).
codec
An abbreviation for compressor/decompressor. Software or hardware used to compress and decompress digital media.
compression
Compression can be lossy or lossless. Lossy compression is a process for removing redundant data from a digital media file or stream to reduce its size or the bandwidth used. You cannot return a file compressed using lossy compression to its original quality. Lossless compression is a process that reduces the file size of the data, but allows the file to retain the original quality in its compressed form.
See also lossy compression.
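The difference can be shown with two toy Python routines, assuming simple run-length coding for the lossless case and amplitude quantization for the lossy case:

```python
def rle_encode(data):
    """Lossless run-length encoding: runs of equal values -> [value, count]."""
    runs = []
    for value in data:
        if runs and runs[-1][0] == value:
            runs[-1][1] += 1
        else:
            runs.append([value, 1])
    return runs

def rle_decode(runs):
    """Inverse of rle_encode: the original data is recovered exactly."""
    return [value for value, count in runs for _ in range(count)]

def quantize(samples, step):
    """Lossy: rounding to the nearest multiple of `step` discards detail
    that cannot be recovered on decompression."""
    return [round(s / step) * step for s in samples]
```

`rle_decode(rle_encode(data))` always reproduces `data`, while `quantize` permanently collapses nearby values into one.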
constant bit rate (CBR)
A characteristic of a data stream in which the bit rate remains nearly uniform for the duration of the stream.
content
Audio, video, images, text, or any other information that is contained in a digital media file or stream.
.dvr-ms
The file name extension of the Microsoft Digital Video Recording file format. Media encoded with this format uses MPEG-2 encoding for video and MPEG-1 Layer II or Dolby Digital AC-3 for audio. This file format is the main Media Center format and is also used by Windows Movie Maker for capturing HDV.
decode
The act of decompressing a compressed digital file. Decoding is handled by a codec that can interpret the compression used on the digital file.
deinterlace
To combine the interlaced fields in a video frame so that, during playback, the lines of the video frame are painted sequentially.
See also interlace.
delta frame
A video frame that contains only the changes from the previous frame. In contrast, a key frame contains all the data necessary to construct that frame.
encode
To convert audio and video content to a specified digital format.
frame rate
The number of video frames displayed per second. Higher frame rates generally produce smoother movement in the picture.
GOP
See group of pictures (GOP).
group of pictures (GOP)
In MPEG or WMV files, a set of frames that begins and ends with an i-frame and contains p-frames or b-frames in between. A GOP is usually 15 frames long in MPEG files and of variable length in WMV files.
See also b-frame, i-frame, and p-frame.
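A toy Python sketch of the frame types in one such group, in display order. The 15-frame pattern with two b-frames between reference frames is a common MPEG default, not a requirement:

```python
def gop_pattern(length=15, bframes=2):
    """List the frame types of one GOP in display order: an i-frame,
    then p-frames with `bframes` b-frames between reference frames."""
    frames = []
    for i in range(length):
        if i == 0:
            frames.append("I")          # the intraframe that opens the GOP
        elif i % (bframes + 1) == 0:
            frames.append("P")          # predicted from earlier frames
        else:
            frames.append("B")          # predicted bidirectionally
    return "".join(frames)
```

With the defaults this yields the familiar IBBPBBP… cadence; the i-frame that opens the next group closes the pattern.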
H.264
A video encoding and compression standard intended to produce good-quality video at lower bit rates, though it is also used to compress video at good quality for a variety of applications, from personal video devices to broadcast and network systems. This standard is also referred to as MPEG-4.
HDTV
See high-definition television (HDTV).
high-definition television (HDTV)
The highest quality of the digital television broadcast formats. HDTV adheres to the following standards, as prescribed by the Advanced Television Systems Committee (ATSC):
An aspect ratio of 16:9.
Playback resolutions of 1280x720 or 1920x1080.
Interlaced scan field rates of 50 (PAL), 59.94, and 60 fields per second, and progressive scan frame rates of 24, 25 (PAL), 29.97, and 30 frames per second.
Theater-quality, AC-3 or PCM audio.
i-frame
Also known as an intraframe. In MPEG and WMV compression, i-frames are compressed without any dependence on data in previous or upcoming frames. The compression algorithm uses only the information in the frame itself as a reference for compression.
See also b-frame and p-frame.
IIS (Internet Information Services) Smooth Streaming
The Microsoft implementation of adaptive streaming, which is a form of web-based content delivery that uses HTTP progressive download technology. Instead of content being delivered by downloading, or in a CBR (constant bit rate) stream, the content is delivered to the client in a series of fragments. Should playback conditions deteriorate, the client can request fragments from the server that are encoded at lower bit rates. Conversely, should playback conditions improve, the client can request playback fragments encoded at higher bit rates. Content providers must have Microsoft Windows Server 2008, Microsoft Internet Information Services (IIS), the Smooth Streaming extension for IIS7, and Silverlight to deliver the full Smooth Streaming experience.
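Stripped to its core, the client's switching decision is "pick the highest encoded bit rate that fits the measured throughput." The bit-rate ladder and safety margin below are illustrative, not what Smooth Streaming actually ships:

```python
def pick_bitrate(available_kbps, measured_kbps, safety=0.8):
    """Choose the highest encoded bit rate that fits within a fraction
    of the measured throughput; fall back to the lowest rate."""
    usable = measured_kbps * safety
    candidates = [b for b in available_kbps if b <= usable]
    return max(candidates) if candidates else min(available_kbps)
```

As throughput falls the function selects lower-rate fragments, and as it recovers the selection climbs back up.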
A feature of Expression Encoder used when encoding media. Expression Encoder will re-encode a particular value, such as video size or audio sample rate, only if you have changed the value.
interlace
Also called interlaced scan. Interlacing means to display a video frame in two alternate fields. One field contains the even lines of the frame; the other field contains the odd lines. During playback, the lines in one field are displayed first, then the lines in the second field are displayed. Interlaced display is the standard display method for DVDs and broadcast television. However, some televisions employ progressive scan display.
See also progressive scan.
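Treating a frame as a list of rows, splitting it into fields and weaving them back together can be modeled directly:

```python
def split_fields(frame):
    """Split a progressive frame into its two interlaced fields."""
    return frame[0::2], frame[1::2]   # even lines, odd lines

def weave(even, odd):
    """Recombine (weave) the two fields into one full frame."""
    frame = []
    for e, o in zip(even, odd):
        frame.extend([e, o])
    return frame
```

Weaving the two fields of a static scene reproduces the original frame exactly; motion between the fields is what causes the familiar combing artifacts.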
inverse telecine
The process that removes the frames that were added when 24-fps film was converted to 30-fps video.
See also telecine.
key frame
A video frame containing all the data needed to construct an image without reference to previous frames.
See also delta frame.
leader
A video file that is appended to the beginning of a main video, such as an introduction containing an animated company logo.
Live Encoding
A feature of Expression Encoder that you can use to produce and broadcast a live, encoded multimedia production.
loop
To repeat a stream continuously.
lossy compression
A process for compressing data in which information deemed unnecessary is removed and cannot be recovered upon decompression. Typically used with audio and visual data in which a slight degradation of quality is acceptable.
See also compression.
.mov
The file name extension of QuickTime video files.
marker
A text string that is associated with a designated time in Windows Media–based content. Markers often denote convenient points to begin playback, such as the start of a new scene.
media splitter
An application that splits the audio and video from a single file into separate files. Most media splitters can split both AVI and MPEG files.
metadata
Data that is embedded in a file that lists specific information about the file such as title, subject, author, and file size.
multicast
A content delivery method in which a single stream is transmitted from a media server to multiple clients. The clients have no connection with the server. Instead, the server sends a single copy of the stream across the network to multicast-enabled routers, which replicate the data. Clients can then receive the stream by monitoring a specific multicast IP address and port.
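IPv4 multicast group addresses come from the reserved 224.0.0.0/4 range. Python's standard library can classify an address, which is a quick way to check the specific multicast IP address a client would monitor:

```python
import ipaddress

def is_multicast_destination(address):
    """True if `address` is a group address clients could join
    (224.0.0.0/4 for IPv4, ff00::/8 for IPv6)."""
    return ipaddress.ip_address(address).is_multicast
```

An ordinary unicast address such as 192.168.0.10 fails this check; a stream sent there reaches exactly one client.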
multiple bit rate (MBR) encoding
An encoding method supported by Expression Encoder that you can use to encode content using several different bit rates. When MBR-encoded content is played back, your audience will be delivered the content using the bit rate that best matches their playback scenario.
multipass encoding
See two-pass encoding.
National Television Standards Committee (NTSC)
The dominant television standard in the United States and Japan. NTSC delivers 30 interlaced frames per second at 525 lines of resolution.
non-square pixels
Pixels that generally have an aspect ratio of 3:4, 32:27, or 8:9, meaning that they are slightly less tall than they are wide, or vice versa. Non-square pixels are standard in digital video. Often referred to as rectangular pixels.
NTSC
See National Television Standards Committee (NTSC).
one-pass encoding
An encoding method in which content is both analyzed and compressed in the same pass through the encoder.
See also two-pass encoding.
overlay
A video or audio track that plays back at the same time as another video or audio track.
PAL
See Phase Alternating Line (PAL).
PCM
See pulse code modulation (PCM).
p-frame
Also known as a predictive frame. In MPEG and WMV compression, p-frames are composed of data from previous frames.
See also b-frame and i-frame.
packet
A formatted collection of data sent across a network. Packets are one of the methods of online content delivery. Streaming is another method, better suited for uninterrupted audio and video transmission.
See also stream.
Phase Alternating Line (PAL)
The dominant television standard in Europe and China. PAL delivers 25 interlaced frames per second at 625 lines of resolution.
pixel aspect ratio
The ratio of a pixel's width to its height. Computer monitor pixels are square, and therefore have a pixel aspect ratio of 1:1.
See also video aspect ratio and non-square pixels.
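Pixel and video aspect ratios are related by simple arithmetic: display width equals stored width multiplied by the pixel aspect ratio. A sketch, using 8:9 as one commonly cited NTSC value:

```python
from fractions import Fraction

def display_size(stored_width, stored_height, par):
    """Scale the stored width by the pixel aspect ratio (a (w, h) pair)
    to get the size at which the picture appears on square pixels."""
    return int(stored_width * Fraction(*par)), stored_height
```

For example, 720x480 video stored with 8:9 pixels displays as 640x480, a 4:3 picture.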
pixel format
The size and arrangement of pixel color components. The format is specified by the total number of bits used per pixel and the number of bits used to store the red, green, blue, and alpha components of the color of the pixel.
preset
In Expression Encoder, a set of preconfigured encoding settings, customized for a number of playback scenarios, that you can quickly apply to your content.
profile
A group of settings that match content type and bit rate with appropriate audio and video codecs.
program stream format
A file format that supports the multiplexing of audio, video, and data. WMV, AVI, MPEG-1, and MPEG-2 are examples of file formats that support program streams. Program streams use variable-sized packets that have a common time base, and they are designed for reliable media, such as hard drives and DVDs.
See also stream format and transport stream format.
progressive scan
The method of displaying a video image one full frame at a time by scanning each line of an image in sequential order. Videos displayed using progressive scan contain fewer motion artifacts and are generally smoother and more stable than interlaced scan display. Progressive scan is a feature of computer monitors, many DVDs, and most HDTVs.
See also interlace.
publishing point
An organized memory location that translates a client request for content into the physical path on the server hosting the content. A publishing point essentially acts as a redirector.
pulse code modulation (PCM)
A technique for digitizing audio into an uncompressed format by assigning a value to the amplitude of the signal at fixed intervals.
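A minimal sketch of the idea: sample a signal function at a fixed rate and map each amplitude to one of a fixed number of integer levels. The 8-bit quantization shown is illustrative:

```python
import math

def pcm_sample(signal, duration, sample_rate, levels=256):
    """Sample `signal` (a function of time in seconds, returning values
    in -1.0..1.0) at fixed intervals and quantize each amplitude."""
    samples = []
    for n in range(int(duration * sample_rate)):
        amplitude = signal(n / sample_rate)
        # Map -1.0..1.0 onto 0..levels-1 integer codes.
        samples.append(round((amplitude + 1) / 2 * (levels - 1)))
    return samples
```

Sampling a 440 Hz tone for 10 ms at 8 kHz, for instance, yields 80 eight-bit codes.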
rectangular pixels
See non-square pixels.
RGB
A color model that describes color information in terms of the red (R), green (G), and blue (B) intensities that make up the color.
sample rate
The frequency of sampling. The higher the sampling rate (that is, the more samples taken per unit of time), the more closely the digitized result resembles the original.
script command
Named data that is associated with a designated time in Windows Media–based content. The data can be used by players to perform a specific action such as displaying a web page.
Silverlight
Microsoft Silverlight is a cross-browser technology that supports a subset of the XML-based XAML (Extensible Application Markup Language) and that enables you to create rich web client experiences. It delivers a lightweight client that supports vector graphics, 2D animation, rich audio and video integration, and a rich Microsoft .NET Framework programming model.
single-pass encoding
See one-pass encoding.
Smooth Streaming
See IIS Smooth Streaming.
source
Audio and video content that can be captured and encoded from devices installed on your computer or from a file.
stream
Digital media that is in the process of being delivered in a continuous flow across a network.
stream format
Information about the properties of a stream, such as the codecs used, frame rate, and frame size. A player uses stream format information to decode a stream.
See also program stream format and transport stream format.
surround sound
Sound formats that feature multiple speakers intended to give the effect of being in a three-dimensional sound environment. Surround sound is generally presented in 5.1 and 7.1 formats. The 5.1 format contains audio information for two front speakers, two rear speakers, a center speaker, and an additional channel called a low frequency effects (LFE) channel, which can accommodate sub-bass signals, or any other sort of cinematic effect. This format is also referred to as Dolby AC-3. The 7.1 format adds a left and right surround channel to the 5.1 channel format. The extra surround speakers are generally centered between the respective front and rear speakers.
.ts (transport stream format)
See transport stream format.
telecine
The film-to-video conversion system that adds frames to video to compensate for the differences in frame rates between film and video.
See also inverse telecine.
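With film frames as labels, 2:3 pulldown (often called 3:2 pulldown) and its inverse can be sketched as follows. A real inverse-telecine filter must detect this cadence in the pixel data; here the duplicates are simply known:

```python
def telecine_pulldown(film_frames):
    """2:3 pulldown: alternate film frames contribute two and then
    three video fields, so 4 film frames become 10 fields."""
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (2 if i % 2 == 0 else 3))
    return fields

def inverse_telecine(fields):
    """Drop the repeated fields to recover the original film frames."""
    frames = []
    for field in fields:
        if not frames or frames[-1] != field:
            frames.append(field)
    return frames
```

Four film frames expand to ten fields (five video frames), which is exactly the 24-to-30 fps ratio.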
trailer
A video file that is appended to the ending of the main video, such as a segment consisting of outtakes or credits.
transport stream format
An industry-standard container format used by video broadcasters and HDV camcorders to stream video data. Transport streams use fixed-size packets with their own time bases and are primarily designed for use in environments where packet loss is more likely.
See also program stream format and stream format.
two-pass encoding
An encoding method in which content is analyzed in one pass through the encoder, after which compression is applied in the second pass.
See also one-pass encoding.
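The payoff of the second pass is that bits can be distributed according to what the first pass learned. A toy allocation, proportional to per-frame complexity (real rate control is far more sophisticated):

```python
def two_pass_allocate(complexities, total_bits):
    """Pass 1 produced a complexity measure per frame; pass 2 gives
    each frame a share of the budget proportional to that measure."""
    total = sum(complexities)
    return [total_bits * c / total for c in complexities]
```

A frame three times as complex as another receives three times the bits, while the overall budget stays fixed.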
UDP
See User Datagram Protocol (UDP).
UNC
See Universal Naming Convention (UNC).
unicast
A method used by media servers for providing content to connected clients in which each client receives a discrete stream. No other client has access to that stream.
Universal Naming Convention (UNC)
The full name of a resource on a network. It conforms to the \\servername\sharename syntax, where servername is the name of the server and sharename is the name of the shared resource. UNC names of directories or files can also include the directory path under the share name, with the following syntax: \\servername\sharename\directory\filename.
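A hypothetical helper that composes a UNC name from its parts (the doubled backslashes are Python string escapes for single backslash characters; the server and file names in the example are made up):

```python
def build_unc(server, share, *path_parts):
    r"""Compose \\servername\sharename[\directory\filename]."""
    return "\\\\" + "\\".join([server, share, *path_parts])
```

For example, `build_unc("media1", "video", "promos", "intro.wmv")` yields `\\media1\video\promos\intro.wmv`.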
User Datagram Protocol (UDP)
A connectionless transport protocol in the TCP/IP protocol stack that is used in cases where some packet loss is acceptable, such as with digital media streams.
variable bit rate (VBR)
A characteristic of a data stream in which the bit rate fluctuates, depending on the complexity of the data.
VBR
See variable bit rate (VBR).
VC-1
VC-1 is a high-quality, industry-standard codec based on Windows Media Video 9. VC-1 is optimized for encoding video intended for professional broadcast and video streaming and offers better compression than MPEG-2.
video aspect ratio
The ratio of video width to video height. The common television aspect ratio is 4:3 and is also known as fullscreen. The high-definition television (HDTV) aspect ratio is 16:9, and is also known as widescreen.
See also pixel aspect ratio.
Windows Media Audio codec
A codec used to compress and decompress audio streams.
Windows Media file
A file containing audio, video, or script data that is stored in Windows Media format. Depending on their content and purpose, Windows Media files use a variety of file name extensions, such as .wma, .wme, .wms, .wmv, .wmx, .wmz, or .wvx.
Windows Media Format
The format used by Windows Media technologies (or a non-Microsoft product that incorporates a licensed Windows Media technology) to author, store, edit, distribute, stream, or play timeline-based content.
Windows Media Screen codec
A codec used to compress and decompress sequences of screen images.
Windows Media Video (WMV) codec
A codec used to compress and decompress video streams.
.xej
The file name extension of an Expression Encoder job file, which is an XML file that contains all the settings, or presets, that you applied to each media file in the job, and also pointers to those original media files.
.xel
The file name extension of an Expression Encoder Live Encoding file, which is an XML file containing all the settings, or presets, that you applied to each file source, and also pointers to those original source files.
.xesc
The file name extension of an Expression Encoder screen capture.
XAML (Extensible Application Markup Language)
An XML-based markup language for declarative application programming that is used in .NET Framework 3.0 and later technologies, such as Silverlight. Expression Encoder can import graphic files created using XAML.
YUV
A color model that describes color information in terms of its brightness (luminance, or Y), and color (chrominance, or U and V).
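Converting from RGB shows the split into one luminance and two chrominance terms. The weights below are the BT.601 definition, one of several in use:

```python
def rgb_to_yuv(r, g, b):
    """Convert RGB (each 0-255) to Y'UV using BT.601 luma weights."""
    y = 0.299 * r + 0.587 * g + 0.114 * b   # brightness
    u = 0.492 * (b - y)                     # blue-difference chroma
    v = 0.877 * (r - y)                     # red-difference chroma
    return y, u, v
```

Any gray input (r == g == b) produces zero chrominance, which is why black-and-white receivers could simply ignore U and V.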