Kinect Fusion Explorer D2D C++ Sample

Kinect for Windows 1.7, 1.8

This sample illustrates how to use the individual pipeline stages of Kinect Fusion for 3D reconstruction.

Important

DirectX 11 feature support is required to run Kinect Fusion.

To determine the DirectX feature support level of your graphics card, run DxDiag.exe:

  1. Launch DxDiag.exe
  2. Navigate to the “Display” tab.
  3. In the “Drivers” area, there is a text field labeled “Feature Levels:”.
  4. If 11.0 is in the list of supported feature levels, then Kinect Fusion will run in GPU mode.

Note: Simply having DirectX 11 installed is not enough; you must also have hardware that supports the DirectX 11.0 feature set.
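The check in the steps above can also be done programmatically. The following is a minimal sketch (Windows-only, linked against d3d11.lib) that asks D3D11CreateDevice whether the default hardware adapter supports feature level 11.0 without actually creating a device:

```cpp
#include <d3d11.h>
#include <stdio.h>
#pragma comment(lib, "d3d11.lib")

int main()
{
    // Request only feature level 11.0; if the hardware cannot provide it,
    // the call fails and Kinect Fusion cannot run in GPU mode.
    D3D_FEATURE_LEVEL requested[] = { D3D_FEATURE_LEVEL_11_0 };
    D3D_FEATURE_LEVEL achieved = static_cast<D3D_FEATURE_LEVEL>(0);

    HRESULT hr = D3D11CreateDevice(
        nullptr,                    // default adapter
        D3D_DRIVER_TYPE_HARDWARE,   // hardware only (no WARP or reference rasterizer)
        nullptr,
        0,
        requested,
        ARRAYSIZE(requested),
        D3D11_SDK_VERSION,
        nullptr,                    // pass NULL: query the feature level only,
        &achieved,                  // no device or context is created
        nullptr);

    if (SUCCEEDED(hr) && achieved >= D3D_FEATURE_LEVEL_11_0)
        printf("DirectX 11.0 feature level supported: Kinect Fusion can run in GPU mode.\n");
    else
        printf("No DirectX 11.0 hardware support: Kinect Fusion will not run in GPU mode.\n");
    return 0;
}
```

Passing NULL for both the device and immediate-context out-parameters is the documented way to test feature-level support without the overhead of device creation.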

Overview

When you run this sample, the application captures depth data from the Kinect sensor and displays the 3D reconstruction of the scene as it is built.

| The sample uses the following APIs | To do this |
| --- | --- |
| NuiImageResolutionToSize function | Get the width and height of the depth frame. |
| NuiGetSensorCount function | Get the number of sensors that are ready for use. |
| NuiCreateSensorByIndex function and INuiSensor interface | Create an interface that represents a connected sensor. |
| INuiSensor::NuiStatus method | Check the sensor status to see if the sensor is connected. |
| INuiSensor::NuiInitialize method and NUI_IMAGE_TYPE_DEPTH_AND_PLAYER_INDEX constant | Initialize the sensor to stream out depth data. |
| NuiFusionCreateReconstruction function | Create the Kinect Fusion reconstruction volume. |
| NuiFusionCreateImageFrame function | Create an image frame for frame data. |
| NuiFusionCreateImageFrame function | Create an image frame for point cloud data. |
| NuiFusionCreateImageFrame function | Create images of the raycast volume to display. |
| INuiSensor::NuiImageStreamGetNextFrame method | Get an extended depth frame from the Kinect. |
| CKinectFusionExplorer::CopyExtendedDepth method | Get extended depth data. |
| INuiSensor::NuiImageStreamReleaseFrame method | Release the Kinect camera frame. |
| NuiFusionShadePointCloud function | Shade the point cloud for rendering. |
| INuiSensor::NuiImageStreamSetImageFrameFlags method and NUI_IMAGE_STREAM_FLAG_ENABLE_NEAR_MODE constant | Set the depth data range to near range. |
| CreateEvent function | Create an event that is signaled when depth data is available, returning an event handle. |
| INuiSensor::NuiImageStreamOpen method, NUI_IMAGE_TYPE_DEPTH constant, NUI_IMAGE_RESOLUTION_640x480 constant, and the event handle | Open a depth stream to receive depth data. |
| INuiSensor::NuiImageStreamGetNextFrame method | Get the next frame of color data (using the color data event handle). |
| INuiFrameTexture::LockRect method and NUI_LOCKED_RECT structure | Lock the texture to prepare for saving texture data. |
| INuiFrameTexture::UnlockRect method | Unlock the texture after saving the texture data. |
| INuiSensor::NuiImageStreamReleaseFrame method | Release each frame of depth data after saving it. |
| INuiSensor::Release method | Release the sensor when you exit the application. |
| NuiShutdown function | Shut down the sensor. |
| INuiFusionColorReconstruction::ResetReconstruction method | Clear the reconstruction volume and set a new world-to-camera transform (camera view pose) and world-to-volume transform. |
| INuiFusionColorReconstruction::AlignDepthFloatToReconstruction method | Align a depth float image to the reconstruction volume to calculate the new camera pose. |
| INuiFusionColorReconstruction::GetCurrentWorldToCameraTransform method | Retrieve the current internal world-to-camera transform (camera view pose). |
| INuiFusionColorReconstruction::GetCurrentWorldToVolumeTransform method | Get the current internal world-to-volume transform. |
| INuiFusionColorReconstruction::IntegrateFrame method | Integrate depth float data and color data into the reconstruction volume from the specified camera pose. |
| INuiFusionColorReconstruction::CalculatePointCloud method | Calculate a point cloud by raycasting into the reconstruction volume from the specified camera pose, returning the 3D points and normals of the zero-crossing dense surface at every visible pixel, along with a color visualization image. |
| INuiFusionColorReconstruction::CalculateMesh method | Export a polygon mesh of the zero-crossing dense surfaces from the reconstruction volume, with per-vertex color. |
| INuiFusionColorReconstruction::DepthToDepthFloatFrame method | Convert the specified array of Kinect depth pixels to a NUI_FUSION_IMAGE_FRAME structure. |
| INuiFusionColorReconstruction::SmoothDepthFloatFrame method | Spatially smooth a depth float image frame using edge-preserving filtering. |
| INuiFusionColorReconstruction::AlignPointClouds method | Align two sets of overlapping oriented point clouds and calculate the camera's relative pose. |
| INuiFusionColorReconstruction::SetAlignDepthFloatToReconstructionReferenceFrame method | Set a reference depth frame that is used internally to aid tracking when AlignDepthFloatToReconstruction is called to calculate a new camera pose. |
| INuiFusionColorReconstruction::CalculatePointCloudAndDepth method | Calculate a point cloud by raycasting into the reconstruction volume from the specified camera pose, returning the 3D points and normals of the zero-crossing dense surface at every visible pixel, along with a color visualization image and the depth to the surface. |
| INuiFusionCameraPoseFinder::ResetCameraPoseFinder method | Clear the INuiFusionCameraPoseFinder. |
| INuiFusionCameraPoseFinder::ProcessFrame method | Add the specified camera frame to the camera pose finder database if the frame differs enough from poses that already exist in the database. |
| INuiFusionCameraPoseFinder::FindCameraPose method | Retrieve the poses in the camera pose finder database that are most similar to the current camera input. |
| INuiFusionCameraPoseFinder::GetStoredPoseCount method | Retrieve the number of frames currently stored in the camera pose finder database. |
| NuiFusionAlignPointClouds function | Align two sets of oriented point clouds and calculate the camera's relative pose. |
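Taken together, the table above describes a per-frame pipeline: acquire a depth frame, convert it to a depth float image, track the camera, integrate the frame into the volume, then raycast and shade the result. The following untested sketch shows one pass through that pipeline using the Kinect for Windows 1.8 NUI and Fusion APIs. For brevity it uses the depth-only INuiFusionReconstruction rather than the sample's INuiFusionColorReconstruction (whose methods take additional color parameters), omits most error handling, and the voxel counts, clip distances, and iteration counts are illustrative assumptions, not the sample's actual values:

```cpp
#include <NuiApi.h>
#include <NuiKinectFusionApi.h>

HRESULT RunFusionPipelineOnce()
{
    // 1. Create and initialize the first available sensor.
    int sensorCount = 0;
    HRESULT hr = NuiGetSensorCount(&sensorCount);
    if (FAILED(hr) || sensorCount == 0) return E_FAIL;

    INuiSensor* pSensor = nullptr;
    hr = NuiCreateSensorByIndex(0, &pSensor);
    if (FAILED(hr)) return hr;
    if (pSensor->NuiStatus() != S_OK) { pSensor->Release(); return E_FAIL; }
    pSensor->NuiInitialize(NUI_INITIALIZE_FLAG_USES_DEPTH);

    // 2. Open a 640x480 depth stream, signaled through an event handle.
    HANDLE hDepthEvent  = CreateEvent(nullptr, TRUE, FALSE, nullptr);
    HANDLE hDepthStream = nullptr;
    pSensor->NuiImageStreamOpen(NUI_IMAGE_TYPE_DEPTH, NUI_IMAGE_RESOLUTION_640x480,
                                0, 2, hDepthEvent, &hDepthStream);

    // 3. Create the reconstruction volume (384^3 voxels at 256 voxels/m --
    //    a 1.5 m cube; values are illustrative).
    NUI_FUSION_RECONSTRUCTION_PARAMETERS params = {};
    params.voxelsPerMeter = 256.0f;
    params.voxelCountX = 384; params.voxelCountY = 384; params.voxelCountZ = 384;

    INuiFusionReconstruction* pVolume = nullptr;
    hr = NuiFusionCreateReconstruction(&params,
        NUI_FUSION_RECONSTRUCTION_PROCESSOR_TYPE_AMP,  // GPU; _CPU also exists
        -1, nullptr, &pVolume);
    if (FAILED(hr)) return hr;

    // Working frames for the depth-float and raycast point-cloud stages.
    NUI_FUSION_IMAGE_FRAME* pDepthFloat = nullptr;
    NUI_FUSION_IMAGE_FRAME* pPointCloud = nullptr;
    NuiFusionCreateImageFrame(NUI_FUSION_IMAGE_TYPE_FLOAT, 640, 480, nullptr, &pDepthFloat);
    NuiFusionCreateImageFrame(NUI_FUSION_IMAGE_TYPE_POINT_CLOUD, 640, 480, nullptr, &pPointCloud);

    // 4. One pass through the per-frame loop.
    NUI_IMAGE_FRAME imageFrame;
    hr = pSensor->NuiImageStreamGetNextFrame(hDepthStream, 100, &imageFrame);
    if (SUCCEEDED(hr))
    {
        // ... copy extended depth pixels out of the frame texture here
        //     (LockRect / UnlockRect), as CKinectFusionExplorer::CopyExtendedDepth
        //     does, then convert them to the depth-float frame:
        // pVolume->DepthToDepthFloatFrame(pDepthPixels, cbDepthPixels,
        //                                 pDepthFloat, 0.35f, 8.0f, TRUE);
        pSensor->NuiImageStreamReleaseFrame(hDepthStream, &imageFrame);

        Matrix4 worldToCamera;
        pVolume->GetCurrentWorldToCameraTransform(&worldToCamera);

        // Track the camera against the volume, then integrate the frame.
        FLOAT alignmentEnergy = 0.0f;
        pVolume->AlignDepthFloatToReconstruction(pDepthFloat, 7, nullptr,
                                                 &alignmentEnergy, &worldToCamera);
        pVolume->GetCurrentWorldToCameraTransform(&worldToCamera);
        pVolume->IntegrateFrame(pDepthFloat, 200, &worldToCamera);

        // Raycast the volume into a point cloud; NuiFusionShadePointCloud
        // would then shade it into displayable surface/normal images.
        pVolume->CalculatePointCloud(pPointCloud, &worldToCamera);
    }

    // 5. Shut down and release everything.
    NuiFusionReleaseImageFrame(pPointCloud);
    NuiFusionReleaseImageFrame(pDepthFloat);
    pVolume->Release();
    pSensor->NuiShutdown();
    pSensor->Release();
    CloseHandle(hDepthEvent);
    return hr;
}
```

The real sample runs step 4 continuously on a worker thread, waiting on the depth event handle, and uses the pose-finder APIs in the table to relocalize when AlignDepthFloatToReconstruction loses tracking.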

To run a sample, you must have the Kinect for Windows SDK installed. To compile a sample, you must have the developer toolkit installed. The latest SDK and developer toolkit are available on the developer download page. If you need help installing the toolkit, see To Install the SDK and Toolkit. The toolkit includes a sample browser, which you can use to launch a sample or download it to your machine. To open the sample browser, click Start > All Programs > Kinect for Windows SDK [version number] > Developer Toolkit Browser.

If you need help loading a sample in Visual Studio or using Visual Studio to compile, run, or debug, see Opening, Building, and Running Samples in Visual Studio.
