Device.Streaming Requirements

Device.Streaming.HMFT

Hardware Media Foundation Transform

Related Requirements
Device.Streaming.HMFT.Decoding
Device.Streaming.HMFT.Encoding

Device.Streaming.HMFT.Decoding

Hardware Media Foundation Transform (HMFT) supports video decoding.

Target Feature
Device.Streaming.HMFT
Applies to
Windows 8 Client x86, x64, ARM (Windows RT)
Windows 8.1 Client x86, x64, ARM (Windows RT 8.1)

Description

Supported Formats

An HMFT video decoder is supported only for MPEG-4 Part 2 and MJPEG.

Media Foundation Compliance

The video decoder Hardware Media Foundation Transform (HMFT) must fully comply with the following Media Foundation Transform (MFT) interfaces:

IMFTransform

IMFMediaEventGenerator

IMFShutdown

IMFQualityAdvise

IMFRealTimeClientEx

The HMFT video decoder must use Media Foundation work queue support (no thread creation) via IMFRealTimeClientEx::SetWorkQueue; a sketch follows the list below. This ensures the following:

Multimedia Class Scheduler Service (MMCSS) support for playback and capture/encode

Improved core scalability
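Below is a minimal sketch, assuming the IMFRealTimeClientEx form of the call (SetWorkQueueEx in the SDK headers), of how an asynchronous decoder HMFT might remember the pipeline-supplied work queue and schedule its decode work on it instead of creating its own threads. The class, member, and helper names are illustrative, not part of any Windows API.

#include <mfapi.h>
#include <mfidl.h>

class CDecoderMft : public IMFRealTimeClientEx /* plus IMFTransform, IMFMediaEventGenerator, ... */
{
    DWORD m_workQueueId = MFASYNC_CALLBACK_QUEUE_MULTITHREADED;
    LONG  m_workItemPriority = 0;

public:
    // IMFRealTimeClientEx: the pipeline hands the MFT an MMCSS-registered work queue.
    STDMETHODIMP SetWorkQueueEx(DWORD dwMultithreadedWorkQueueId, LONG lWorkItemBasePriority)
    {
        m_workQueueId      = dwMultithreadedWorkQueueId;
        m_workItemPriority = lWorkItemBasePriority;
        return S_OK;
    }
    STDMETHODIMP RegisterThreadsEx(DWORD*, LPCWSTR, LONG) { return S_OK; }
    STDMETHODIMP UnregisterThreads() { return S_OK; }

    // Queue the next decode work item on the shared MF work queue; the MFT
    // creates no private threads.
    HRESULT ScheduleDecode(IMFAsyncCallback* pDecodeCallback)
    {
        return MFPutWorkItem2(m_workQueueId, m_workItemPriority, pDecodeCallback, nullptr);
    }

    // IUnknown omitted for brevity; a real decoder HMFT also implements
    // IMFTransform, IMFMediaEventGenerator, IMFShutdown, and IMFQualityAdvise.
    STDMETHODIMP QueryInterface(REFIID, void**) { return E_NOTIMPL; }
    STDMETHODIMP_(ULONG) AddRef()  { return 1; }
    STDMETHODIMP_(ULONG) Release() { return 1; }
};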

The HMFT video decoder must support IMF2DBuffer2 for enhanced security, but it must also work with input buffers of type IMF2DBuffer and IMFMediaBuffer, in that order of preference.

DirectX Rendering

The video decoder HMFT must support both DirectX (DX) 9 and DX11 devices, and it must avoid copies into or out of DX11 components. On MFT_MESSAGE_SET_D3D_MANAGER, the video decoder HMFT must first query for the DirectX Graphics Infrastructure (DXGI) device manager, and then query for the D3D9 device manager only if the DXGI manager is not found; a sketch of this query order follows the attribute list below. The HMFT video decoder must support system memory output because some transforms in the pipeline may support only system memory buffers. If the HMFT video decoder is GPU-based, it must support DX11.

Memory Usage

The HMFT video decoder must be an asynchronous MFT. This reduces the memory working set by about 50 percent.

Trusted Merit Certification and Verification

The video decoder HMFT must support the Trusted Merit Certification and Verification process, as defined in the Windows Hardware Certification Kit. Each HMFT video decoder must be provided as a separate binary and must be individually signed. HMFT video decoders must not advertise support for more than one compression standard.

All HMFTs must set the following Media Foundation attributes when registering the MFT with the system:

MFT_ENUM_HARDWARE_URL_Attribute

MFT_ENUM_HARDWARE_VENDOR_ID_Attribute
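Below is a minimal sketch of the DXGI-before-D3D9 query order described under DirectX Rendering above. The helper name is illustrative; in practice this logic would live in the decoder's IMFTransform::ProcessMessage handler for MFT_MESSAGE_SET_D3D_MANAGER, whose ULONG_PTR parameter carries the device manager's IUnknown.

#include <mfapi.h>
#include <mfidl.h>
#include <dxva2api.h>    // IDirect3DDeviceManager9
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

HRESULT SelectDeviceManager(ULONG_PTR ulParam,
                            ComPtr<IMFDXGIDeviceManager>& spDxgiManager,
                            ComPtr<IDirect3DDeviceManager9>& spD3D9Manager)
{
    spDxgiManager.Reset();
    spD3D9Manager.Reset();

    IUnknown* pManager = reinterpret_cast<IUnknown*>(ulParam);
    if (pManager == nullptr)
        return S_OK;   // Manager removed; fall back to system-memory output.

    // Prefer the DXGI (DX11) device manager ...
    if (SUCCEEDED(pManager->QueryInterface(IID_PPV_ARGS(&spDxgiManager))))
        return S_OK;

    // ... and accept the D3D9 device manager only if DXGI is not available.
    return pManager->QueryInterface(IID_PPV_ARGS(&spD3D9Manager));
}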

Format Requirements

HMFT video decoders must not advertise support for inbox formats that are supported by DirectX Video Acceleration (DXVA) (H.264, WMV, MPEG-2). If implemented, the HMFT video decoder for MPEG-4 Part 2 must support the Simple and Advanced Simple profiles and all levels. If Global Motion Compensation (GMC) is not supported, the media type must be rejected so that the software decoder is used for playback.

The decoder must be fully conformant to specifications that are defined for the format.

The MPEG-4 Part 2 decoder must fully support H.263 baseline content and advertise support for this media type.

In addition to the preceding requirements, we recommend that the decoder support post-processing for deblocking and deringing. Vendors may provide other HMFT video decoders for formats that are not supported inbox, but no verification tests or logo certification are available for them. Note: The recommendations and requirements that are defined in this document apply to all formats.

Functionality

The video decoder HMFT must support the following functionality:

Dynamic format and resolution changes

Trick modes (playback rate control, thinning mode) and seek

Performance

The HMFT video decoder must be able to decode 40 megabits per second (Mbps) at 1080p in real time.

Interlace Support

The HMFT video decoder must support both interlaced and progressive input bit streams. It must not de-interlace. It may support inverse telecine (IVTC).

Multiple Instances

The HMFT video decoder must support multiple instances of the decoder in parallel (both in-process and out-of-process) to enable multiple concurrent video playback streams in the same or different applications.

Design Notes

The HMFT video decoder must be installed and uninstalled through a device driver that meets Windows security requirements. The driver must not cause the operating system to crash or hang, and it must disallow memory violations. Each HMFT component must be a separate binary, individually certified and signed.

Additional Information

Business Justification
HMFT is a feature that enables Independent Hardware Vendors (IHVs) to provide hardware media solutions for formats that are not supported by DXVA. DX9 support is required for older applications. DX11 support is required for new features, scenarios, and improved performance.
Enforcement Date
Mar. 01, 2012

Device.Streaming.HMFT.Encoding

Hardware Media Foundation Transform (HMFT) supports video encoding

Target Feature
Device.Streaming.HMFT
Applies to
Windows 8 Client x86, x64, ARM (Windows RT)
Windows 8.1 Client x86, x64, ARM (Windows RT 8.1)

Description

A. H.264 Encode

If your hardware supports H.264 Encode, you must:

A.1 On input:

A.1.1 Support NV12

A.1.2 If your hardware supports it:

A.1.2.1 Support IYUV and YUY2

A.1.3 Support input buffers of the following types, queried in this order (see the sketch after this list):

A.1.3.1 IMF2DBuffer2

A.1.3.2 IMF2DBuffer

A.1.3.3 IMFMediaBuffer
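A minimal sketch of the query order in A.1.3, applied to one input buffer; the helper name is illustrative and the fallback pitch handling is simplified.

#include <mfapi.h>
#include <mfobjects.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

HRESULT LockInputBuffer(IMFMediaBuffer* pBuffer, BYTE** ppData, LONG* plPitch)
{
    *ppData = nullptr;
    *plPitch = 0;

    // 1. IMF2DBuffer2 (preferred; supports the enhanced-security lock flags).
    ComPtr<IMF2DBuffer2> sp2DBuffer2;
    if (SUCCEEDED(pBuffer->QueryInterface(IID_PPV_ARGS(&sp2DBuffer2))))
    {
        BYTE* pBufferStart = nullptr;
        DWORD cbBuffer = 0;
        return sp2DBuffer2->Lock2DSize(MF2DBuffer_LockFlags_Read,
                                       ppData, plPitch, &pBufferStart, &cbBuffer);
    }

    // 2. IMF2DBuffer.
    ComPtr<IMF2DBuffer> sp2DBuffer;
    if (SUCCEEDED(pBuffer->QueryInterface(IID_PPV_ARGS(&sp2DBuffer))))
        return sp2DBuffer->Lock2D(ppData, plPitch);

    // 3. Plain IMFMediaBuffer; the caller derives the pitch from the
    //    negotiated media type (for example, MF_MT_DEFAULT_STRIDE).
    DWORD cbMax = 0, cbCurrent = 0;
    return pBuffer->Lock(ppData, &cbMax, &cbCurrent);
}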

A.2 On output:

A.2.1 Support Baseline profile

A.2.2 Support Constrained Baseline profile

A.2.3 Support Main profile

A.2.4 Support Constrained High profile

A.3 Your H.264 encoder must expose the following interfaces:

A.3.1 IMFTransform

A.3.1.1 IMFTransform::ProcessEvent

Must handle the MEEncodingParameters event

A.3.2 ICodecAPI

A.3.2.1 ICodecAPI requires the following functions to be implemented:

A.3.2.1.1 IsSupported

A.3.2.1.2 GetValue

A.3.2.1.3 SetValue

A.3.2.1.4 GetParameterRange

A.3.2.1.5 GetParameterValues

A.3.2.2 ICodecAPI requires the following properties to be supported:

A.3.2.2.1 Peak-constrained VBR mode

A.3.2.2.2 Quality-based VBR mode

A.3.2.2.3 CBR encoding

A.3.2.2.4 GOP distance

Must support a parameter range with a maximum value of INT_MAX

A.3.2.2.5 Frame-QP control

Must support at minimum a QP range of 20 to 40. Must return an error when an invalid value (for example, 52) is set.

A.3.2.2.6 Force-Key-frame control

The IDR frame must be preceded by SPS/PPS. When the number of temporal layers is greater than 1, the key frame must be inserted at the next base temporal layer frame in order to preserve the temporal structure. For a single-layer bitstream, the key frame must be inserted at the next frame.

A.3.2.2.7 QualityVsSpeed control

A.3.2.2.8 Low-latency encoding

In low-latency mode, the encoder must either force POC type 2, or set the VUI bitstream_restriction flag to TRUE with VUI num_reorder_frames equal to 0.

A.3.2.2.9 Temporal-layer encoding (1-3 layers)

A.3.2.2.10 CODECAPI_AVEncVideoMaxNumRefFrame

A.3.2.2.11 CODECAPI_AVEncSliceControlMode

A.3.2.2.12 CODECAPI_AVEncSliceControlSize

A.3.2.2.13 CODECAPI_AVEncVideoMeanAbsoluteDifference

A.3.2.2.14 CODECAPI_AVEncVideoEncodeFrameTypeQP

A.3.2.2.15 CODECAPI_AVEncVideoMaxQP

A.3.2.3 It is recommended that you additionally implement the following functions:

A.3.2.3.1 GetDefaultValue

A.3.2.3.2 IsModifiable

A.3.2.3.3 SetAllDefaults

A.3.2.4 If your H.264 encoder implements Long Term Reference frames, then the following CodecAPI interfaces should be implemented (completed within one frame delay):

A.3.2.4.1 CODECAPI_AVEncVideoLTRBufferControl

A.3.2.4.2 CODECAPI_AVEncVideoMarkLTRFrame

A.3.2.4.3 CODECAPI_AVEncVideoUseLTRFrame

Note: The IMFSample attribute MFSampleExtension_LongTermReferenceFrameInfo must be supported. It is also recommended that the IMFSample return MAD (mean absolute difference) for each frame.

If the encoder supports LTR, it must support a minimum of 2 LTR frames in both trust modes.
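To illustrate A.3.2, the following is a hedged sketch of how an application might configure a few of the required properties through ICodecAPI on the encoder MFT. The helper name and the chosen values (CBR at 6 Mbps, a GOP of 60 frames, a maximum QP of 40) are illustrative only.

#include <mftransform.h>
#include <strmif.h>     // ICodecAPI
#include <codecapi.h>   // CODECAPI_* GUIDs and eAVEnc* enumerations
#include <oleauto.h>    // VariantInit
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

HRESULT ConfigureEncoder(IMFTransform* pEncoderMft)
{
    ComPtr<ICodecAPI> spCodecApi;
    HRESULT hr = pEncoderMft->QueryInterface(IID_PPV_ARGS(&spCodecApi));
    if (FAILED(hr)) return hr;

    VARIANT var;
    VariantInit(&var);
    var.vt = VT_UI4;

    // CBR rate control (A.3.2.2.3) at 6 Mbps.
    var.ulVal = eAVEncCommonRateControlMode_CBR;
    hr = spCodecApi->SetValue(&CODECAPI_AVEncCommonRateControlMode, &var);
    if (FAILED(hr)) return hr;

    var.ulVal = 6 * 1000 * 1000;
    hr = spCodecApi->SetValue(&CODECAPI_AVEncCommonMeanBitRate, &var);
    if (FAILED(hr)) return hr;

    // GOP distance (A.3.2.2.4).
    var.ulVal = 60;
    hr = spCodecApi->SetValue(&CODECAPI_AVEncMPVGOPSize, &var);
    if (FAILED(hr)) return hr;

    // Maximum QP (A.3.2.2.15); the encoder must reject invalid values such as 52.
    var.ulVal = 40;
    hr = spCodecApi->SetValue(&CODECAPI_AVEncVideoMaxQP, &var);
    if (FAILED(hr)) return hr;

    // Force a key frame on the next input (A.3.2.2.6).
    var.ulVal = 1;
    return spCodecApi->SetValue(&CODECAPI_AVEncVideoForceKeyFrame, &var);
}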

A.3.3 IMFAttributes

A.3.3.1 Via IMFTransform::GetAttributes

A.3.3.1.1 The three required attributes are:

A.3.3.1.2 MFT_ENUM_HARDWARE_URL_Attribute

A.3.3.1.3 MFT_ENUM_HARDWARE_VENDOR_ID_Attribute

A.3.3.1.4 MFT_ENCODER_SUPPORTS_CONFIG_EVENT
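A hedged sketch of populating the attribute store that the encoder returns from IMFTransform::GetAttributes with the three attributes above. The hardware URL and vendor ID strings are placeholders, not real device values.

#include <mfapi.h>
#include <mftransform.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

HRESULT CreateEncoderAttributes(ComPtr<IMFAttributes>& spAttributes)
{
    HRESULT hr = MFCreateAttributes(&spAttributes, 3);
    if (FAILED(hr)) return hr;

    // Symbolic link of the underlying hardware device (placeholder value).
    hr = spAttributes->SetString(MFT_ENUM_HARDWARE_URL_Attribute,
                                 L"\\\\?\\placeholder_device_symbolic_link");
    if (FAILED(hr)) return hr;

    // Vendor identifier of the hardware (placeholder value).
    hr = spAttributes->SetString(MFT_ENUM_HARDWARE_VENDOR_ID_Attribute, L"VEN_0000");
    if (FAILED(hr)) return hr;

    // The encoder handles MEEncodingParameters configuration events (A.3.1.1).
    return spAttributes->SetUINT32(MFT_ENCODER_SUPPORTS_CONFIG_EVENT, TRUE);
}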

A.3.4 It is recommended that you implement an asynchronous MFT

A.3.4.1 Asynchronous MFTs require the following additional interfaces:

A.3.4.1.1 IMFMediaEventGenerator

A.3.4.1.2 IMFShutdown

A.3.4.2 Asynchronous MFTs are encouraged to avoid creating threads and are recommended to use the MF Thread Pool

A.3.4.2.1 Registration with MMCSS via IMFRealTimeClientEx::SetWorkQueue is critical to meeting performance goals

A.3.5 IMFRealTimeClientEx is recommended

A.4 Encoding settings

A.4.1 Through input media type negotiation, the H.264 Encoder must support:

A.4.1.1 MF_MT_INTERLACE_MODE

A.4.1.1.1 The encoder must preserve interlace from input to output or reject interlace

A.4.1.2 MF_MT_MINIMUM_DISPLAY_APERTURE

A.4.2 Through output media type negotiation, the H.264 Encoder must support:

A.4.2.1 MF_MT_SUBTYPE

A.4.2.2 MF_MT_MINIMUM_DISPLAY_APERTURE

A.4.2.3 MF_MT_FRAME_RATE

A.4.2.4 MF_MT_FRAME_SIZE

A.4.2.5 MF_MT_AVG_BITRATE

A.4.2.6 MF_MT_MPEG2_PROFILE(MF_MT_VIDEO_PROFILE)

A.4.2.7 MF_MT_MPEG2_LEVEL(MF_MT_VIDEO_LEVEL)

A.4.2.8 MF_MT_PIXEL_ASPECT_RATIO

A.4.2.9 MF_MT_H264_MAX_MB_PER_SEC (for App to query encoder capabilities)
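A hedged sketch of an output media type that carries the attributes listed in A.4.2, offered to the encoder with IMFTransform::SetOutputType. The specific values (1080p at 30 fps, 8 Mbps, Main profile, level 4.1) are illustrative.

#include <mfapi.h>
#include <mfidl.h>
#include <mftransform.h>
#include <codecapi.h>   // eAVEncH264VProfile_* and eAVEncH264VLevel* values
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

HRESULT SetH264OutputType(IMFTransform* pEncoderMft, DWORD dwOutputStreamId)
{
    ComPtr<IMFMediaType> spType;
    HRESULT hr = MFCreateMediaType(&spType);
    if (FAILED(hr)) return hr;

    spType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
    spType->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_H264);                 // A.4.2.1
    MFSetAttributeSize(spType.Get(), MF_MT_FRAME_SIZE, 1920, 1080);     // A.4.2.4
    MFSetAttributeRatio(spType.Get(), MF_MT_FRAME_RATE, 30, 1);         // A.4.2.3
    MFSetAttributeRatio(spType.Get(), MF_MT_PIXEL_ASPECT_RATIO, 1, 1);  // A.4.2.8
    spType->SetUINT32(MF_MT_AVG_BITRATE, 8 * 1000 * 1000);              // A.4.2.5
    spType->SetUINT32(MF_MT_INTERLACE_MODE, MFVideoInterlace_Progressive);
    spType->SetUINT32(MF_MT_MPEG2_PROFILE, eAVEncH264VProfile_Main);    // A.4.2.6
    spType->SetUINT32(MF_MT_MPEG2_LEVEL, eAVEncH264VLevel4_1);          // A.4.2.7

    return pEncoderMft->SetOutputType(dwOutputStreamId, spType.Get(), 0);
}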

A.4.3 It is recommended that your H.264 Encoder supports:

A.4.3.1 B frame encoding

A.5 Multiple Instances

A.5.1 It is required that your H.264 encoder support a minimum of 3 concurrent instances. The encoder should not place an arbitrary limit on the number of instances; the practical limit should be bounded only by memory usage.

A.5.1.1 These instances may be in the same process or in different processes

A.6 Merit Validation

A.6.1 It is required that your H.264 encoder supports the trusted merit verification process

A.6.2 It is required that your H.264 encoder be a separate binary, individually certified and signed

A.7 Additional Requirements

A.7.1 Your H.264 encoder must work with the Windows MP4 file sink

A.7.2 Your H.264 encoder must implement proper order of encoding configuration

A.7.3 Your H.264 encoder must be capable of producing valid bitstreams with up to 3 temporal layers

A.7.4 Your H.264 encoder must prepend a prefix NALU to each slice, with correct syntax, in bitstreams with 1-3 temporal layers

A.7.5 Your H.264 encoder must send one batched GPU request per frame

A.7.6 Your H.264 encoder must insert an AUD NALU before each frame

A.7.7 Your H.264 encoder must work correctly in session 0

A.8 Installation

A.8.1 It is required that your H.264 encoder is registered and unregistered along with the device driver used in the encoder

A.9 Performance

A.9.1 It is required that your H.264 encoder be capable of real-time encoding of 1080p at 30 fps at bit rates up to 12 Mbps; Level 4.1 is preferred.

A.9.2 VideoEncoderCreationLatency:

The system latency of HMFT creation must be shorter than 100 ms in each of 10 runs

A.9.3 VideoEncoderHighGPUUsage

Your H.264 encoder must preserve one-in-one-out behavior in low-latency mode, while preserving bitstream quality, even if the GPU is under stress (for example, a simultaneous 720p encode and decode).

A.10 Dynamic Format Change

It is required that your H.264 encoder supports dynamic format changes of the following within one frame latency:

Profile Change

Resolution and Frame Rate Change

Add and Delete Temporal Layers

Maximum allowed number of reference frames

When multiple changes are requested before an output frame is produced, the last change should be honored.
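A hedged sketch, assuming the resolution and frame-rate change is driven by the application re-negotiating the output media type on the running encoder; the helper name and the 1280x720 at 60 fps target are illustrative, and implementations may also receive such changes through the MEEncodingParameters event.

#include <mfapi.h>
#include <mftransform.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

HRESULT ChangeResolutionAndFrameRate(IMFTransform* pEncoderMft, DWORD dwOutputStreamId)
{
    // Start from the currently negotiated output type so the profile, bitrate,
    // and the other A.4.2 attributes are preserved.
    ComPtr<IMFMediaType> spCurrent, spNew;
    HRESULT hr = pEncoderMft->GetOutputCurrentType(dwOutputStreamId, &spCurrent);
    if (FAILED(hr)) return hr;

    hr = MFCreateMediaType(&spNew);
    if (FAILED(hr)) return hr;
    hr = spCurrent->CopyAllItems(spNew.Get());
    if (FAILED(hr)) return hr;

    MFSetAttributeSize(spNew.Get(), MF_MT_FRAME_SIZE, 1280, 720);
    MFSetAttributeRatio(spNew.Get(), MF_MT_FRAME_RATE, 60, 1);

    // Ask the encoder whether it accepts the change before committing it.
    hr = pEncoderMft->SetOutputType(dwOutputStreamId, spNew.Get(), MFT_SET_TYPE_TEST_ONLY);
    if (FAILED(hr)) return hr;

    return pEncoderMft->SetOutputType(dwOutputStreamId, spNew.Get(), 0);
}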

A.11 New MF Attributes

Your H.264 encoder must support the following new MF attributes:

MF_MT_VIDEO_NOMINAL_RANGE

MFSampleExtension_MeanAbsoluteDifference

MFSampleExtension_LongTermReferenceFrameInfo

MFSampleExtension_ROIRectangles
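A hedged sketch of reading two of these per-sample attributes from an encoded output IMFSample. Both are treated as UINT32 values here, which matches their use in A.3.2.2.13 and A.3.2.4; confirm the exact types against the Windows SDK headers for your target release.

#include <mfapi.h>
#include <mfobjects.h>

HRESULT ReadEncodeSampleInfo(IMFSample* pOutputSample,
                             UINT32* pMeanAbsoluteDifference,
                             UINT32* pLtrInfo)
{
    // Mean absolute difference reported for the frame (A.3.2.2.13).
    HRESULT hr = pOutputSample->GetUINT32(MFSampleExtension_MeanAbsoluteDifference,
                                          pMeanAbsoluteDifference);
    if (FAILED(hr)) return hr;

    // Long-term reference frame information (A.3.2.4).
    return pOutputSample->GetUINT32(MFSampleExtension_LongTermReferenceFrameInfo, pLtrInfo);
}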

A.12 Quality

Your H.264 encoder must be within 1 dB of, or better than, the inbox H.264 encoder, where quality is measured over a rate-distortion (R-D) curve.

B. VC-1 Encode (Required for Windows RT)

If your hardware supports VC-1 Encode, you must:

B.1 On input:

B.1.1 Support NV12

B.1.2 If your hardware supports it:

B.1.2.1 Support IYUV and YUY2

B.1.3 Support input buffers of (and query in this order):

B.1.3.1 IMF2DBuffer2

B.1.3.2 IMF2DBuffer

B.1.3.3 IMFMediaBuffer

B.2 On output:

B.2.1 Support Simple profile

B.2.2 Support Main profile

B.2.3 Support Advanced profile

B.3 Your VC-1 encoder must expose the following interfaces:

B.3.1 IMFTransform

B.3.2 ICodecAPI

B.3.2.1 ICodecAPI requires the following functions to be implemented:

B.3.2.1.1 IsSupported

B.3.2.1.2 GetValue

B.3.2.1.3 SetValue

B.3.2.2 ICodecAPI requires the following properties to be supported:

B.3.2.2.1 Peak-constrained VBR mode

B.3.2.2.2 Quality-based VBR mode

B.3.2.2.3 CBR encoding

B.3.2.2.4 GOP distance

B.3.2.2.5 Frame-QP control

B.3.2.2.6 Adaptive bitrate control

B.3.2.2.7 Force-Key-frame control

B.3.2.2.8 QualityVsSpeed control

B.3.2.2.9 Low-latency encoding

B.3.2.3 It is recommended that you additionally implement the following functions:

B.3.2.3.1 GetDefaultValue

B.3.2.3.2 GetParameterRange

B.3.2.3.3 GetParameterValues

B.3.2.3.4 IsModifiable

B.3.2.3.5 SetAllDefaults

B.3.3 IMFAttributes

B.3.3.1 Via IMFTransform::GetAttributes

B.3.3.2 The two required attributes are:

B.3.3.2.1 MFT_ENUM_HARDWARE_URL_Attribute

B.3.3.2.2 MFT_ENUM_HARDWARE_VENDOR_ID_Attribute

B.3.4 It is recommended that you implement an asynchronous MFT

B.3.4.1 Asynchronous MFTs require the following additional interfaces:

B.3.4.1.1 IMFMediaEventGenerator

B.3.4.1.2 IMFShutdown

B.3.4.2 Asynchronous MFTs are encouraged to avoid creating threads and are recommended to use the MF Thread Pool

B.3.4.2.1 Registration with MMCSS via IMFRealTimeClientEx::SetWorkQueue is critical to meeting performance goals

B.3.5 IMFRealTimeClientEx is recommended

B.4 Encoding Settings

B.4.1 Through input media type negotiation, the VC-1 Encoder must support:

B.4.1.1 MF_MT_INTERLACE_MODE

B.4.1.1.1 The encoder must preserve interlace from input to output or reject interlace

B.4.1.2 MF_MT_MINIMUM_DISPLAY_APERTURE

B.4.2 Through output media type negotiation, the VC-1 Encoder must support:

B.4.2.1 MF_MT_SUBTYPE

B.4.2.2 MF_MT_FRAME_SIZE

B.4.2.3 MF_MT_FRAME_RATE

B.4.2.4 MF_MT_AVG_BITRATE

B.4.2.5 MF_MT_PIXEL_ASPECT_RATIO

B.4.3 It is recommended that your VC-1 Encoder supports:

B.4.3.1 B frame encoding

B.5 Multiple Instances

B.5.1 It is required that your VC-1 encoder support a minimum of 3 concurrent instances. The encoder should not place an arbitrary limit on the number of instances; the practical limit should be bounded only by memory usage.

B.5.1.1 These instances may be in the same process or in different processes

B.6 Merit Validation

B.6.1 It is required that your VC-1 encoder supports the trusted merit verification process

B.6.2 It is required that your VC-1 encoder be a separate binary, individually certified and signed

B.7 Additional Requirements

B.7.1 Your VC-1 encoder must work with the Windows ASF file sink

B.7.2 Your VC-1 encoder must implement proper order of encoding configuration

B.7.3 Your VC-1 encoder must work correctly in session 0

B.8 Installation

B.8.1 It is required that your VC-1 encoder is registered and unregistered along with the device driver used in the encoder

B.9 Performance

B.9.1 It is required that your VC-1 encoder be capable of real-time encoding of 1280x720 at 30 fps at bit rates up to 7 Mbps on x86/x64 systems

B.9.2 Your VC-1 encoder must be capable of real-time encoding of 720x480 at 30 fps at bit rates up to 5 Mbps on ARM systems

Additional Information

Business Justification
HMFT is a feature introduced in Windows 7 to enable Independent Hardware Vendors (IHVs) to provide hardware media solutions
Enforcement Date
Jun. 26, 2013

Device.Streaming.Webcam.Base

Webcam features

Related Requirements
Device.Streaming.Webcam.Base.AVStreamWDMAndInterfaceRequirements
Device.Streaming.Webcam.Base.BasicPerf
Device.Streaming.Webcam.Base.DirectShowAndMediaFoundation
Device.Streaming.Webcam.Base.KSCategoryVideoCameraRegistration
Device.Streaming.Webcam.Base.MultipleClientAppSupport
Device.Streaming.Webcam.Base.SurpriseRemoval
Device.Streaming.Webcam.Base.UsageIndicator

Device.Streaming.Webcam.Base.AVStreamWDMAndInterfaceRequirements

Streaming media device driver must be based on AVStream class and WDM, and must meet interface requirements

Target Feature
Device.Streaming.Webcam.Base
Applies to
Windows 7 Client x86, x64
Windows 8 Client x86, x64, ARM (Windows RT)
Windows 8.1 Client x86, x64, ARM (Windows RT 8.1)

Description

WDM Requirement

Device drivers for any streaming media device must use the AVStream class and the Windows Driver Model (WDM) as defined in the Windows Driver Kit. AVStream is the replacement technology for the older stream class driver model, which is outdated and will no longer be enhanced. Drivers for streaming media devices must also support all of the required pins, properties, and settings as defined in the Windows Driver Kit.

Note: Peripheral Component Interconnect Express (PCI-e)-based video capture devices must use the AVStream class. USB-based devices must be USB Video Class (UVC) compliant as defined in Device.Streaming.Webcam.USBClassDriver.UVCDriver and Device.Streaming.Webcam.USBClassDriver.UVC.

Error Conditions

Error conditions include (but are not limited to) forced invalid pin connections, invalid property sets, buffers with invalid data, null pointers, and error conditions from drivers above or below on the stack.

Interface Requirements: IVideoEncoderAPI and ICodecAPI

Streaming media devices that encode video streams must use IVideoEncoderAPI and ICodecAPI. This enables the device driver to uniformly configure hardware and software encoders in the system. To support Media Center functionality, broadcast receivers must support encoding by using the Video Encoder application programming interface (API). Support for this feature enables Media Center to explicitly set the video data rate for optimal compatibility with DVD burning. Specifically, the device and driver must support the VariableBitratePeak mode defined by IVideoEncoderAPI to allow the peak video rate to be limited. The device and driver must support setting the ENCAPIPARAM_PEAK_BITRATE property correctly through IVideoEncoderAPI to a value of 8. The ENCAPIPARAM_BITRATE value may be set to any value below the value of ENCAPIPARAM_PEAK_BITRATE to enable variable bit rate (VBR) recordings. The video compression encoder for the tuner hardware must do the following:

Generate DVD-compliant MPEG-2 elementary video streams when the bit rate is set at or below 9 megabits per second (Mbps) VBR. If the software application sets the bit rate above allowable DVD data rates, DVD compliance is not required (Required on x86 and x64 architectures and operating systems only).

Support dynamic bit rate change during run time. The encoder must be capable of dynamically changing the encoding quality bit rate up or down without requiring the Microsoft DirectShow filter to be stopped and restarted by changing the ENCAPIPARAM_BITRATE value. The ENCAPIPARAM_PEAK_BITRATE value is not required to change dynamically.

Support all compression bit rates at least up to a maximum bit rate of 9 Mbps in VariableBitrateAverage mode defined by IVideoEncoderAPI. Higher peak rates are encouraged but are not required.

Support setting ENCAPIPARAM_BITRATE correctly through IVideoEncoderAPI to a value of 2, up to a value of 9. Higher average rates are encouraged but are not required.

To enable Media Center to detect whether a driver can support dynamic bit rate change, the .inf file must contain the GUID. In addition, the driver must place the GUID in the registry. The registry value must be a DWORD with a value of 1 (meaning supported); 0 or Not Present means unsupported. Add the following line in the .inf file:

HKR,Capabilities,"{BB4FAA02-596C-4129-8FB3-74E75421FA02}",0x00010001,1

This produces the following registry value:

[KEY]
"{BB4FAA02-596C-4129-8FB3-74E75421FA02}"=dword:1

where [KEY] is the HKEY returned by IGetCapabilitiesKey::GetCapabilitiesKey().

Design Notes: For implementation details, see "Streaming Devices (Video and Audio)", "AVStream Class Minidrivers", and "Stream Class Minidrivers" in the Windows Driver Kit.
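The following is a hedged sketch of putting a tuner's video encoder into peak-constrained VBR mode through the Encoder API property GUIDs named above, using ICodecAPI on the encoder filter. Assumptions: the encoder filter exposes ICodecAPI, and the bit-rate values are expressed in bits per second (8 Mbps peak, 6 Mbps average here); the helper name is illustrative.

#include <dshow.h>      // IBaseFilter
#include <strmif.h>     // ICodecAPI, ENCAPIPARAM_* GUIDs, VIDEOENCODER_BITRATE_MODE
#include <oleauto.h>    // VariantInit

HRESULT ConfigurePeakConstrainedVbr(IBaseFilter* pEncoderFilter)
{
    ICodecAPI* pCodecApi = nullptr;
    HRESULT hr = pEncoderFilter->QueryInterface(IID_PPV_ARGS(&pCodecApi));
    if (FAILED(hr)) return hr;

    VARIANT var;
    VariantInit(&var);

    // Peak-constrained VBR mode.
    var.vt   = VT_I4;
    var.lVal = VariableBitRatePeak;
    hr = pCodecApi->SetValue(&ENCAPIPARAM_BITRATE_MODE, &var);

    // Peak bit rate: 8 Mbps.
    if (SUCCEEDED(hr))
    {
        var.vt    = VT_UI4;
        var.ulVal = 8 * 1000 * 1000;
        hr = pCodecApi->SetValue(&ENCAPIPARAM_PEAK_BITRATE, &var);
    }

    // An average bit rate below the peak enables VBR recordings.
    if (SUCCEEDED(hr))
    {
        var.ulVal = 6 * 1000 * 1000;
        hr = pCodecApi->SetValue(&ENCAPIPARAM_BITRATE, &var);
    }

    pCodecApi->Release();
    return hr;
}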

Additional Information

Enforcement Date
Jun. 26, 2013

Device.Streaming.Webcam.Base.BasicPerf

Captured frame data must be provided within two frame periods

Target Feature
Device.Streaming.Webcam.Base
Applies to
Windows 8 Client x86, x64, ARM (Windows RT)
Windows 8.1 Client x86, x64, ARM (Windows RT 8.1)

Description

Video camera hardware must provide captured frame data to the driver within two frame periods of the initiation of capture from the sensor.

Additional Information

Enforcement Date
Jun. 26, 2013

Device.Streaming.Webcam.Base.DirectShowAndMediaFoundation

Support for streaming media device must be based on DirectShow or Media Foundation architectures

Target Feature
Device.Streaming.Webcam.Base
Applies to
Windows 7 Client x86, x64
Windows 8 Client x86, x64, ARM (Windows RT)
Windows 8.1 Client x86, x64, ARM (Windows RT 8.1)

Description

Support for streaming media devices must be based on Microsoft DirectShow or Microsoft Media Foundation. Substitute components may be used instead of the DirectShow components that are provided with the operating system. Substitute components must include equivalent functionality based on the components provided with the operating system and must support at least the same inputs and outputs defined for DirectShow.

Design Notes

DirectShow support is available on x86 and x64 platforms only.

Additional Information

Enforcement Date
Jun. 26, 2013

Device.Streaming.Webcam.Base.KSCategoryVideoCameraRegistration

Non-Microsoft webcam driver must register under KSCategory_Video_Camera

Target Feature
Device.Streaming.Webcam.Base
Applies to
Windows 8 Client x86, x64, ARM (Windows RT)
Windows 8.1 Client x86, x64, ARM (Windows RT 8.1)

Description

All non-Microsoft webcam drivers must register under the KSCATEGORY_VIDEO_CAMERA category so that Media Foundation capture applications can detect the camera.

Additional Information

Enforcement Date
Jun. 26, 2013

Device.Streaming.Webcam.Base.MultipleClientAppSupport

Streaming media device must support multiple client applications and instances by using a single device

Target Feature
Device.Streaming.Webcam.Base
Applies to
Windows 7 Client x86, x64
Windows 8 Client x86, x64, ARM (Windows RT)
Windows 8.1 Client x86, x64, ARM (Windows RT 8.1)

Description

Application Sharing

Client applications must be able to open the device and then use it simultaneously or sequentially, depending on the device's capabilities. The device can allow one application to actively use it while another application is in a pause or stop state, or the device can allow both applications to use it simultaneously.

Multiple Instances

Streaming media devices must be able to control multiple instances of the device when those instances are used by multiple applications. An example is a computer that has three identical USB webcams, each being used by a different application. Another example is a computer that has two TV receivers, each being independently used by two different applications.

Additional Information

Enforcement Date
Jun. 26, 2013

Device.Streaming.Webcam.Base.SurpriseRemoval

Removable streaming media device must support surprise removal of that device

Target Feature
Device.Streaming.Webcam.Base
Applies to
Windows 7 Client x86, x64
Windows 8 Client x86, x64, ARM (Windows RT)
Windows 8.1 Client x86, x64, ARM (Windows RT 8.1)

Description

All hot-pluggable streaming media devices must support their surprise removal from the host bus.

Additional Information

Enforcement Date
Jun. 26, 2013

Device.Streaming.Webcam.Base.UsageIndicator

Video capture device must have a visual indicator to indicate usage

Target Feature
Device.Streaming.Webcam.Base
Applies to
Windows 8 Client x86, x64, ARM (Windows RT)
Windows 8.1 Client x86, x64, ARM (Windows RT 8.1)

Description

A video capture device must have a visual indicator that shows when it is recording the user. The visual indicator should be on when the device is capturing video and off when the device is not capturing video. System builders who are building systems with Windows 8.1 Update or newer pre-installed, and who do not want a hardware status indicator, must make the following registry entry to enable the Microsoft Windows onscreen camera indicator:

Path: HKLM\SOFTWARE\Microsoft\OEM\Device\Capture

Entry: NoPhysicalCameraLED (REG_DWORD)

0x1: Turn on the feature (= No Physical camera LED on the system)

0x0: Default. Turn off the feature (= Physical camera LED on the system)
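For reference, this is the same entry expressed as a .reg file that an OEM image could import (a sketch; the path, value name, and data mirror the listing above):

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\OEM\Device\Capture]
"NoPhysicalCameraLED"=dword:00000001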

For older systems, a hardware status indicator is required.

Additional Information

Enforcement Date
Jun. 26, 2013

Device.Streaming.Webcam.H264

Webcam features

Related Requirements
Device.Streaming.Webcam.H264.H264Support

Device.Streaming.Webcam.H264.H264Support

If implemented, H.264 implementation must comply with USB Video Class driver

Target Feature
Device.Streaming.Webcam.H264
Applies to
Windows 8 Client x86, x64, ARM (Windows RT)
Windows 8.1 Client x86, x64, ARM (Windows RT 8.1)

Description

If the H.264 format is supported by the device, the implementation must be compliant with the Windows USB Video Class (UVC) driver. The Windows driver follows this specification: https://go.microsoft.com/fwlink/?LinkId=233063. At a minimum, the following must be supported:

Pins

A preview pin on the same device with uncompressed output type YUY2 and/or NV12

H.264 format must be on a separate video capture pin

Additional Information

Enforcement Date
Jun. 26, 2013

Device.Streaming.Webcam.NonMSDriver

Webcam features

Related Requirements
Device.Streaming.Webcam.NonMSDriver.VideoInfoHeader2

Device.Streaming.Webcam.NonMSDriver.VideoInfoHeader2

Video capture driver must implement VIDEOINFOHEADER2

Target Feature
Device.Streaming.Webcam.NonMSDriver
Applies to
Windows 7 Client x86, x64
Windows 8 Client x86, x64, ARM (Windows RT)
Windows 8.1 Client x86, x64, ARM (Windows RT 8.1)

Description

Drivers for Windows Driver Model (WDM) streaming video capture devices must implement support for the VIDEOINFOHEADER2 structure if the device supports capturing interlaced video, pixel aspect ratios other than 1:1, or a nonstandard YCbCr transfer matrix, gamma curve, chromaticity coordinates, or reference black and white levels. The structure conveys video source information such as the interlace format, aspect ratio, and color space. Note: This requirement applies to both USB-based and Peripheral Component Interconnect Express (PCI-e)-based video capture devices.

Design Notes: For implementation details, see "Streaming Devices" and the AVStream sample capture driver in the Windows Driver Kit.
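A hedged sketch of filling a VIDEOINFOHEADER2 for a 720x480 interlaced YUY2 source with a 4:3 picture aspect ratio. The values are illustrative; an AVStream minidriver would express the same information with KS_VIDEOINFOHEADER2 in its pin data ranges.

#include <windows.h>
#include <dvdmedia.h>   // VIDEOINFOHEADER2, AMINTERLACE_* flags

void FillVideoInfoHeader2(VIDEOINFOHEADER2* pvih2)
{
    ZeroMemory(pvih2, sizeof(*pvih2));

    // Source and destination rectangles (empty = use the whole frame).
    SetRectEmpty(&pvih2->rcSource);
    SetRectEmpty(&pvih2->rcTarget);

    pvih2->AvgTimePerFrame    = 333667;          // ~29.97 fps, in 100-ns units
    pvih2->dwInterlaceFlags   = AMINTERLACE_IsInterlaced |
                                AMINTERLACE_DisplayModeBobOrWeave;
    pvih2->dwPictAspectRatioX = 4;               // 4:3 picture aspect ratio
    pvih2->dwPictAspectRatioY = 3;

    // Uncompressed YUY2, 720x480, 16 bits per pixel.
    pvih2->bmiHeader.biSize        = sizeof(BITMAPINFOHEADER);
    pvih2->bmiHeader.biWidth       = 720;
    pvih2->bmiHeader.biHeight      = 480;
    pvih2->bmiHeader.biPlanes      = 1;
    pvih2->bmiHeader.biBitCount    = 16;
    pvih2->bmiHeader.biCompression = MAKEFOURCC('Y', 'U', 'Y', '2');
    pvih2->bmiHeader.biSizeImage   = 720 * 480 * 2;

    pvih2->dwBitRate = pvih2->bmiHeader.biSizeImage * 8 * 30;  // approximate
}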

Additional Information

Enforcement Date
Jun. 26, 2013

Device.Streaming.Webcam.USBClassDriver

Webcam features

Related Requirements
Device.Streaming.Webcam.USBClassDriver.UVC
Device.Streaming.Webcam.USBClassDriver.UVCDriver

Device.Streaming.Webcam.USBClassDriver.UVC

USB streaming video camera must comply with USB Video Class specifications

Target Feature
Device.Streaming.Webcam.USBClassDriver
Applies to
Windows 7 Client x86, x64
Windows 8 Client x86, x64, ARM (Windows RT)
Windows 8.1 Client x86, x64, ARM (Windows RT 8.1)

Description

USB streaming video cameras must comply with the USB Video Class specifications. At a minimum, all mandatory properties and commands must be implemented. All implemented commands must comply with the specifications. USB streaming video cameras that use MJPEG or YUY2 for capture, or Digital Video (DV) for capture or render, must also work with the Microsoft-provided USB Video Class driver.

Additional Information

Enforcement Date
Jun. 26, 2013

Device.Streaming.Webcam.USBClassDriver.UVCDriver

UVC-capable device must comply with USB Video Class specifications and work with the Microsoft UVC driver

Target Feature
Device.Streaming.Webcam.USBClassDriver
Applies to
Windows 7 Client x86, x64
Windows 8 Client x86, x64, ARM (Windows RT)
Windows 8.1 Client x86, x64, ARM (Windows RT 8.1)

Description

Devices that are designed to comply with the USB Video Class (UVC) specifications must work with the Microsoft UVC driver. These devices also must comply with the requirements set in the UVC specifications. At a minimum, all mandatory properties and commands must be implemented. All implemented commands must comply with the specifications.

Additional Information

Enforcement Date
Jun. 26, 2013