IReferenceClockTimerControl::SetDefaultTimerResolution method

The SetDefaultTimerResolution method sets the minimum timer resolution.


HRESULT SetDefaultTimerResolution(
  [in] REFERENCE_TIME timerResolution
);

Parameters

timerResolution [in]

Minimum timer resolution, in 100-nanosecond units. If the value is zero, the reference clock cancels its previous request.

Return value

Returns an HRESULT value. Possible values include the following.

Return code	Description

S_OK	The method succeeded.

Remarks
The reference clock attempts to set the period of the timer to timerResolution. The actual period of the timer might differ, depending on the hardware. To find the minimum and maximum timer resolution, call the timeGetDevCaps function. The reference clock sets the timer resolution by calling timeBeginPeriod. If timerResolution is zero, the method cancels the previous timer request by calling timeEndPeriod. (When the reference clock is destroyed, it automatically cancels any outstanding request.)
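The flow described above can be sketched as follows. This is a Windows-specific, illustrative fragment, not a verified implementation; the helper name is hypothetical, and the IReferenceClockTimerControl pointer is assumed to have been obtained elsewhere (for example, by querying the graph's reference clock):

```cpp
#include <windows.h>
#include <mmsystem.h> // timeGetDevCaps, TIMECAPS; link with winmm.lib
#include <dshow.h>    // REFERENCE_TIME, IReferenceClockTimerControl

// Hypothetical helper: request the finest timer resolution the
// hardware reports, expressed in 100-nanosecond units.
HRESULT RequestFinestResolution(IReferenceClockTimerControl *pTimerCtrl)
{
    TIMECAPS tc;
    if (timeGetDevCaps(&tc, sizeof(tc)) != TIMERR_NOERROR)
        return E_FAIL;

    // wPeriodMin is in milliseconds; REFERENCE_TIME counts 100-ns units,
    // so multiply by 10,000.
    REFERENCE_TIME rt = (REFERENCE_TIME)tc.wPeriodMin * 10000;
    return pTimerCtrl->SetDefaultTimerResolution(rt);
}
```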

If this method is never called, the reference clock sets the timer resolution to 1 millisecond. For the best power management, it is recommended that you call this method with a value of zero, which overrides the clock's default setting of 1 millisecond. If any filters in the graph require a finer timer resolution, they can call timeBeginPeriod individually. Typically, only renderers should require a particular timer resolution.
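As a sketch of the recommendation above (Windows-specific and illustrative; the helper name is hypothetical, and the IReferenceClock pointer is assumed to come from an existing filter graph), requesting a value of zero might look like:

```cpp
#include <dshow.h> // IReferenceClockTimerControl (Strmif.h, pulled in by Dshow.h)

// Hypothetical helper: ask the reference clock to drop its default
// 1-ms timer request, for better power management.
HRESULT RelaxClockTimerResolution(IReferenceClock *pClock)
{
    IReferenceClockTimerControl *pTimerCtrl = NULL;
    // The timer-control interface is optional; a clock may not expose it.
    HRESULT hr = pClock->QueryInterface(IID_IReferenceClockTimerControl,
                                        reinterpret_cast<void **>(&pTimerCtrl));
    if (SUCCEEDED(hr))
    {
        // Zero cancels the clock's previous request; individual filters
        // can still call timeBeginPeriod if they need a finer resolution.
        hr = pTimerCtrl->SetDefaultTimerResolution(0);
        pTimerCtrl->Release();
    }
    return hr;
}
```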


Requirements

Minimum supported client

Windows Vista [desktop apps only]

Minimum supported server

Windows Server 2008 [desktop apps only]


Header

Strmif.h (include Dshow.h)



See also

Error and Success Codes
IReferenceClockTimerControl Interface