IDXGIDevice1::GetMaximumFrameLatency method (dxgi.h)

Gets the number of frames that the system is allowed to queue for rendering.

Syntax

HRESULT GetMaximumFrameLatency(
  [out] UINT *pMaxLatency
);

Parameters

[out] pMaxLatency

Type: UINT*

This value is set to the number of frames that can be queued for rendering.
The value defaults to 3 but can range from 1 to 16.
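The following is a minimal sketch, not part of the official documentation, showing one way the current limit might be read. It assumes an existing Direct3D 11 device (the device parameter name is hypothetical) and that the DXGI device interface is obtained through QueryInterface.

#include <d3d11.h>
#include <dxgi.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Reads the current maximum frame latency for the given Direct3D 11 device.
// Returns 0 if the query fails.
UINT ReadMaxFrameLatency(ID3D11Device* device)
{
    ComPtr<IDXGIDevice1> dxgiDevice;
    UINT latency = 0;

    // A Direct3D 11 device also exposes the DXGI device interfaces.
    if (SUCCEEDED(device->QueryInterface(IID_PPV_ARGS(&dxgiDevice))))
    {
        // Retrieve the number of frames that can be queued (defaults to 3).
        if (FAILED(dxgiDevice->GetMaximumFrameLatency(&latency)))
        {
            latency = 0;
        }
    }
    return latency;
}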

Return value

Type: HRESULT

Returns S_OK if successful; otherwise, returns one of the following members of the D3DERR enumerated type:

  • D3DERR_DEVICELOST
  • D3DERR_DEVICEREMOVED
  • D3DERR_DRIVERINTERNALERROR
  • D3DERR_INVALIDCALL
  • D3DERR_OUTOFVIDEOMEMORY

Remarks

This method is not supported by DXGI 1.0, which shipped in Windows Vista and Windows Server 2008. DXGI 1.1 support is required, which is available on Windows 7, Windows Server 2008 R2, and as an update to Windows Vista with Service Pack 2 (SP2) (KB 971644) and Windows Server 2008 (KB 971512).

Frame latency is the number of frames that are allowed to be stored in a queue before submission for rendering. Latency is often used to control how the CPU balances responding to user input against processing frames that are already in the render queue. It is often beneficial for applications that have no user input (for example, video playback) to queue more than 3 frames of data.
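As an illustration of that remark, the following sketch (assuming an IDXGIDevice1 pointer obtained as shown earlier) raises the queue depth for a playback-style workload with SetMaximumFrameLatency and then reads the value back; the chosen depth of 5 is an arbitrary example within the documented 1 to 16 range.

// Hypothetical helper: deepen the frame queue for a workload with no user input.
HRESULT UseDeeperQueue(IDXGIDevice1* dxgiDevice)
{
    // Allow up to 5 frames to be queued ahead of presentation (valid range: 1 to 16).
    HRESULT hr = dxgiDevice->SetMaximumFrameLatency(5);
    if (FAILED(hr))
    {
        return hr;
    }

    // Read the value back to confirm what the runtime applied.
    UINT applied = 0;
    return dxgiDevice->GetMaximumFrameLatency(&applied);
}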

Requirements

Requirement Value
Minimum supported client Windows 7 [desktop apps | UWP apps]
Minimum supported server Windows Server 2008 R2 [desktop apps | UWP apps]
Target Platform Windows
Header dxgi.h
Library DXGI.lib

See also

DXGI Interfaces

IDXGIDevice1

IDXGIDevice1::SetMaximumFrameLatency