Camera Driver Overview (Windows Embedded CE 6.0)

1/6/2010

The Windows Embedded CE Video Camera Device Driver Interface provides basic support for video and still image capture devices.

This documentation is intended for developers writing camera drivers. While developers working on camera applications can use this documentation to gain insight into how Windows Embedded CE interacts with camera hardware, camera applications should not communicate directly with the camera driver; they should use only the middleware layer provided by the DirectShow video capture infrastructure to communicate with and control the camera.

The primary consumer of the camera device driver interface is the DirectShow video capture filter. Unless clearly stated otherwise, when the camera driver documentation refers to the client or application, this really means DirectShow components.
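
For orientation, the following C++ sketch shows that intended path: the application builds a DirectShow filter graph and lets the video capture filter open and drive the camera driver. This is a minimal sketch rather than a definitive recipe; the class and interface names used here (CLSID_FilterGraph, CLSID_CaptureGraphBuilder2, CLSID_VideoCapture, ICaptureGraphBuilder2) follow standard DirectShow usage and should be verified against the Windows Embedded CE 6.0 headers, and error handling and interface cleanup are abbreviated.

    // Minimal sketch: previewing a camera through the DirectShow middleware
    // instead of talking to the camera driver directly.
    #include <windows.h>
    #include <dshow.h>

    #define CHECK_HR(expr)  do { HRESULT _hr = (expr); if (FAILED(_hr)) return _hr; } while (0)

    HRESULT StartCameraPreview()
    {
        IGraphBuilder         *pGraph   = NULL;
        ICaptureGraphBuilder2 *pBuilder = NULL;
        IBaseFilter           *pCamera  = NULL;
        IMediaControl         *pControl = NULL;

        CHECK_HR(CoInitializeEx(NULL, COINIT_MULTITHREADED));

        // The filter graph and the capture graph builder are the middleware layer
        // that applications are expected to use.
        CHECK_HR(CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC_SERVER,
                                  IID_IGraphBuilder, (void**)&pGraph));
        CHECK_HR(CoCreateInstance(CLSID_CaptureGraphBuilder2, NULL, CLSCTX_INPROC_SERVER,
                                  IID_ICaptureGraphBuilder2, (void**)&pBuilder));
        CHECK_HR(pBuilder->SetFiltergraph(pGraph));

        // The video capture filter is the component that actually opens and drives
        // the camera driver on the application's behalf.
        CHECK_HR(CoCreateInstance(CLSID_VideoCapture, NULL, CLSCTX_INPROC_SERVER,
                                  IID_IBaseFilter, (void**)&pCamera));
        CHECK_HR(pGraph->AddFilter(pCamera, L"Video Capture"));

        // Route the camera's preview pin to a video renderer: the live viewfinder.
        CHECK_HR(pBuilder->RenderStream(&PIN_CATEGORY_PREVIEW, &MEDIATYPE_Video,
                                        pCamera, NULL, NULL));

        CHECK_HR(pGraph->QueryInterface(IID_IMediaControl, (void**)&pControl));
        CHECK_HR(pControl->Run());

        // Release calls omitted for brevity.
        return S_OK;
    }

On a device with more than one camera, the capture filter is normally told which camera instance to open (for example, through its property bag) before it is added to the graph; the exact mechanism belongs to the video capture filter documentation and is omitted here.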

There are a few cases where an application might need to reference the camera's device driver interface directly:

  • Device enumeration routines (see the enumeration sketch after this list)
  • Test and diagnostic software
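
For the enumeration case, a test or diagnostic tool can walk the devices that advertise the camera device class through the Device Manager search APIs. The sketch below is illustrative only: it assumes the camera driver advertises a camera device class GUID through its IClass registry value, and the GUID initializer is a placeholder to be replaced with the value from the platform's camera driver registry settings.

    // Sketch: enumerating camera stream drivers by device class GUID.
    #include <windows.h>
    #include <windev.h>     // FindFirstDevice / FindNextDevice (header name may vary by OS design)

    // Placeholder: substitute the GUID the camera driver publishes in its IClass value.
    static const GUID CAMERA_CLASS_GUID = { 0 };

    void ListCameraDevices()
    {
        DEVMGR_DEVICE_INFORMATION di = { 0 };
        di.dwSize = sizeof(di);

        HANDLE hSearch = FindFirstDevice(DeviceSearchByGuid, &CAMERA_CLASS_GUID, &di);
        if (hSearch == INVALID_HANDLE_VALUE)
            return;                             // no cameras advertised under this class

        do
        {
            // szLegacyName is the stream driver name (for example, "CAM1:");
            // szDeviceName is the Device Manager name.
            RETAILMSG(1, (L"Camera: %s (%s)\r\n", di.szLegacyName, di.szDeviceName));
        } while (FindNextDevice(hSearch, &di));

        FindClose(hSearch);
    }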

Camera Characteristics

The following list shows the characteristics of the type of capture device that the Windows Embedded CE camera driver model is designed for.

  • The device captures segments of compressed or uncompressed video streams to a file. The exact duration of capture depends on the storage and processing power available on the device. Although short bursts are typical, the driver architecture allows for recordings of unlimited length.
  • The device supports a live viewfinder to preview the video capture.
  • In addition to full-motion video, the device also supports still image capture.

The Windows Embedded CE camera driver model is designed around small, convenient camera scenarios for broad consumer devices such as cameras integrated into cell phones or PDAs. The driver model is not intended to support advanced capture scenarios such as full-featured digital cameras or video conferencing.

Video ports are not supported. Robust, high-quality still image functionality takes precedence over other features.

While the camera device driver model does not specifically target scenarios with multiple cameras on a single device, it will support multiple cameras as well as pluggable cameras.

Hardware Design

The following list summarizes general design goals for common camera hardware technologies.

DMA capability

The camera driver should, where possible, take advantage of whatever DMA capabilities are available on the platform.

DSP support for encoding

To enable real-time encoding on small devices, the camera driver provides interfaces to support hardware-based encoding and decoding.

Asynchronous I/O operation

The camera driver supports asynchronous I/O operations to enable efficient streaming of video frames and maximal parallelism between the CPU and the camera hardware (a notification-queue sketch follows this table).

Camera control properties

The camera driver should provide an interface that lets applications enumerate standard camera properties, such as hue and saturation. The driver should also provide a way to expose custom property sets for camera devices (a client-side example appears after this table).
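
To make the asynchronous I/O goal above concrete, the following sketch shows one pattern that fits it: the client creates a Windows Embedded CE point-to-point message queue, the driver posts a small notification into it whenever the hardware finishes filling a frame buffer, and the client blocks only when nothing is ready. This is an illustration of the pattern, not the camera DDI itself; FRAME_DONE_MSG and the function names are hypothetical, and the real DDI defines its own buffer descriptors and notification messages.

    // Illustrative only: asynchronous frame-completion notification through a
    // point-to-point message queue. FRAME_DONE_MSG is a hypothetical structure.
    #include <windows.h>
    #include <msgqueue.h>

    typedef struct
    {
        DWORD dwBufferIndex;    // which client-visible buffer now holds a finished frame
        DWORD dwBytesUsed;      // payload size of that frame
    } FRAME_DONE_MSG;

    // Client side: create the read end of the queue; the handle is then handed to the
    // driver, which opens a write end with OpenMsgQueue.
    HANDLE CreateFrameQueue()
    {
        MSGQUEUEOPTIONS opts = { 0 };
        opts.dwSize        = sizeof(opts);
        opts.dwMaxMessages = 8;                     // bounded so the driver cannot run ahead forever
        opts.cbMaxMessage  = sizeof(FRAME_DONE_MSG);
        opts.bReadAccess   = TRUE;
        return CreateMsgQueue(NULL, &opts);
    }

    // Client side: block until the driver reports a completed frame. The buffer itself was
    // filled by the camera hardware (or DMA) while the CPU was free to do other work.
    BOOL WaitForFrame(HANDLE hQueue, FRAME_DONE_MSG *pMsg)
    {
        DWORD cbRead = 0, dwFlags = 0;
        return ReadMsgQueue(hQueue, pMsg, sizeof(*pMsg), &cbRead, INFINITE, &dwFlags);
    }

    // Driver side: after the hardware finishes a frame, post the notification and return
    // immediately instead of completing the request synchronously.
    void NotifyFrameDone(HANDLE hWriteQueue, DWORD dwIndex, DWORD dwBytes)
    {
        FRAME_DONE_MSG msg = { dwIndex, dwBytes };
        WriteMsgQueue(hWriteQueue, &msg, sizeof(msg), 0, 0);
    }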

The camera driver model does not support video ports.
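
From the application side, the standard camera control properties listed above surface through DirectShow property interfaces rather than through the driver itself. The sketch below uses IAMVideoProcAmp to read and adjust hue; whether a given capture filter exposes this interface, and which properties the underlying driver actually implements, varies by platform, so treat both as assumptions to verify.

    // Sketch: adjusting a standard camera property (hue) through DirectShow.
    // pCamera is the video capture filter already added to a graph (see the earlier sketch).
    #include <dshow.h>

    HRESULT NudgeHue(IBaseFilter *pCamera)
    {
        IAMVideoProcAmp *pProcAmp = NULL;
        HRESULT hr = pCamera->QueryInterface(IID_IAMVideoProcAmp, (void**)&pProcAmp);
        if (FAILED(hr))
            return hr;                          // the driver does not expose this property set

        long lMin, lMax, lStep, lDefault, lCaps;
        hr = pProcAmp->GetRange(VideoProcAmp_Hue, &lMin, &lMax, &lStep, &lDefault, &lCaps);
        if (SUCCEEDED(hr))
        {
            long lValue = 0, lFlags = 0;
            pProcAmp->Get(VideoProcAmp_Hue, &lValue, &lFlags);

            // Move one step toward the maximum, staying inside the advertised range.
            long lNew = lValue + lStep;
            if (lNew > lMax) lNew = lMax;
            hr = pProcAmp->Set(VideoProcAmp_Hue, lNew, VideoProcAmp_Flags_Manual);
        }

        pProcAmp->Release();
        return hr;
    }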

Performance Terms

The following list describes terms commonly used when discussing camera performance metrics.

Lag time

The delay between when the shutter is pressed and when the camera actually takes the picture.

Shutter lag (pre-focused)

The elapsed time between when the shutter is released and when the image is acquired without an auto-focus stage.

Shutter lag (auto-focus)

The elapsed time between when the shutter is released and when the image is acquired with an auto-focus stage.

Image file acquisition delay

The elapsed time between when the shutter is released and when the image file is written and closed.

Preview startup latency

The elapsed time between when the preview window (electronic viewfinder) is started and a live image appears onscreen.

Preview latency

The delay between when an event in the real world occurs and when that event registers in the preview window.

Memory limitations

The amount of system memory required during the encoding process.

Video encoding speed

The amount of time required to record a video clip: the time required for the event itself plus any encoding time after the event.

Inter-frame delay

The minimum possible time between two consecutive still images.

Rendering times for frames

The amount of time needed to render one frame onto the display, measured under each of the following conditions:

  • Color conversion needed
  • Decoding needed
  • Uncompressed RGB

See Also

Other Resources

Camera Driver Development Concepts