RenderState.SlopeScaleDepthBias Property

Gets or sets a value used to determine how much bias can be applied to coplanar primitives to reduce flickering caused by z-fighting. The default is 0.

Namespace: Microsoft.Xna.Framework.Graphics
Assembly: Microsoft.Xna.Framework (in microsoft.xna.framework.dll)

public float SlopeScaleDepthBias { get; set; }

Property Value

A floating-point value that specifies the slope scale depth bias to apply.

Polygons that are coplanar in your 3D space can be made to appear as if they are not coplanar by adding a z-bias to each one. An application can help ensure that coplanar polygons are rendered properly by adding a bias to the z-values that the system uses when rendering sets of coplanar polygons.

The following formula shows how to calculate the bias to be applied to coplanar primitives.

bias = (m × SlopeScaleDepthBias) + DepthBias

Where m is the maximum depth slope of the triangle being rendered, defined as:

m = max( abs(delta z / delta x), abs(delta z / delta y) )
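The two formulas above can be checked with a short sketch (plain Python rather than C#, with hypothetical depth-slope and bias values chosen purely for illustration):

```python
def compute_bias(dz_dx, dz_dy, slope_scale_depth_bias, depth_bias):
    """Compute the depth bias applied to a coplanar primitive:
    bias = (m * SlopeScaleDepthBias) + DepthBias,
    where m = max(abs(dz/dx), abs(dz/dy)) is the maximum
    depth slope of the triangle being rendered."""
    m = max(abs(dz_dx), abs(dz_dy))
    return m * slope_scale_depth_bias + depth_bias

# A triangle whose depth changes by 0.5 per unit x and 0.25 per unit y,
# with SlopeScaleDepthBias = 2.0 and DepthBias = 0.001:
bias = compute_bias(0.5, 0.25, 2.0, 0.001)
print(bias)  # 0.5 * 2.0 + 0.001 = 1.001
```

Steeper triangles (larger m) receive proportionally more bias, which is why a slope-scaled bias handles surfaces viewed at grazing angles better than a constant DepthBias alone.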

Platforms: Xbox 360, Windows XP SP2, Windows Vista
