How to work with grayscale in a camera app for Windows Phone 8

Applies to: Windows Phone 8 and Windows Phone Silverlight 8.1 | Windows Phone OS 7.1

Starting with Windows Phone OS 7.1, you can programmatically access the phone’s camera using the Microsoft.Devices.PhotoCamera class. This topic describes how to alter live video frames from the camera preview buffer. The app described in this topic demonstrates how to process alpha, red, green, and blue (ARGB) frames from the camera and convert them to grayscale. This topic corresponds to the Camera Grayscale Sample.

Tip:

If your Windows Phone 8 app needs to process grayscale frames, consider using the GetPreviewBufferY(Byte[]) method. This method uses the efficient YCbCr format to capture only the luminance (Y) information from the camera preview buffer. For more info about using the PhotoCaptureDevice class, see Advanced photo capture for Windows Phone 8.
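
For illustration only, the following sketch shows how a luminance-only buffer could be filled from the sample's PhotoCamera object (named cam), which also exposes GetPreviewBufferY. The buffer size shown is an assumption based on one byte per preview pixel; on some devices the frame may include extra stride padding.

    // A minimal sketch, assuming an initialized PhotoCamera named cam.
    // The Y plane holds one luminance byte per preview pixel, so width * height bytes
    // is typically sufficient (some devices may need extra room for stride padding).
    int width = (int)cam.PreviewResolution.Width;
    int height = (int)cam.PreviewResolution.Height;
    byte[] luminance = new byte[width * height];
    cam.GetPreviewBufferY(luminance); // copies only the Y (luminance) channel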

This topic is divided into two parts: creating the camera UI and base functionality, and creating the ARGB frame pump.

Important Note:

When upgrading Windows Phone OS 7.0 apps to use the capabilities in Windows Phone OS 7.1, the camera capability ID_CAP_ISV_CAMERA is not automatically added to the app manifest file, WMAppManifest.xml. Without ID_CAP_ISV_CAMERA, apps using the camera API won’t function. In new Windows Phone OS 7.1 projects, this capability is included in the app manifest file.

The following image illustrates the camera app created in this topic.

[Image: AP_Con_CameraGrayscale]

In this section, you create the camera UI, which consists of a viewfinder region, a StackPanel control containing buttons for toggling between color and grayscale modes, and an Image control that overlays the viewfinder region for grayscale viewing.

To create the camera UI and base functionality

  1. Using the Windows Phone SDK, create a new project using the Windows Phone App template.

  2. After your project has been created, on the Project menu, select Add Reference. On the .NET tab, choose Microsoft.XNA.Framework, and then click OK.

  3. In the MainPage.xaml file, update the phone:PhoneApplicationPage element as shown in the following code.

        SupportedOrientations="Landscape" Orientation="LandscapeLeft"
        shell:SystemTray.IsVisible="False"
    
    

    This configures the page for landscape orientation and hides the system tray.
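
    If you prefer, the same settings can be applied from code-behind. The following sketch is an optional equivalent, not a required step; it assumes the template-generated MainPage constructor and its default using directives.

        // Optional code-behind equivalent of the XAML attributes above.
        public MainPage()
        {
            InitializeComponent();
            SupportedOrientations = SupportedPageOrientation.Landscape;  // landscape-only page
            Microsoft.Phone.Shell.SystemTray.IsVisible = false;          // hide the system tray
        }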

  4. On MainPage.xaml, replace the Grid named LayoutRoot with the following code.

        <!--LayoutRoot is the root grid where all page content is placed-->
        <Grid x:Name="LayoutRoot" Background="Transparent">
            <Grid.ColumnDefinitions>
                <ColumnDefinition Width="640" />
                <ColumnDefinition Width="160*" />
            </Grid.ColumnDefinitions>
    
        <!--Camera viewfinder-->
            <Rectangle Width="640" Height="480" HorizontalAlignment="Left" >
                <Rectangle.Fill>
                    <VideoBrush x:Name="viewfinderBrush" />
                </Rectangle.Fill>
    
            </Rectangle>
    
            <!--Overlay for the viewfinder region to display grayscale WriteableBitmap objects-->
            <Image x:Name="MainImage" 
                   Width="320" Height="240" 
                   HorizontalAlignment="Left" VerticalAlignment="Bottom"  
                   Margin="16,0,0,16"
                   Stretch="Uniform"/>
    
        <!--Button StackPanel to the right of the viewfinder-->
            <StackPanel Grid.Column="1" >
                <Button             
                    Content="Gray: ON"
                    Name="GrayscaleOnButton"  
                    Click="GrayOn_Clicked" />
                <Button             
                    Content="Gray: OFF"
                    Name="GrayscaleOffButton"  
                    Click="GrayOff_Clicked" />
            </StackPanel>
    
        <!--Used for debugging-->
            <TextBlock Height="40" HorizontalAlignment="Left" Margin="8,428,0,0" Name="txtDebug" VerticalAlignment="Top" Width="626" FontSize="24" FontWeight="ExtraBold" />
    
        </Grid>
    
    

    This XAML creates a 640×480 viewfinder region and a StackPanel control that contains the grayscale on and off buttons. The Image control overlays the viewfinder region when grayscale mode is selected, providing an alternate viewing surface for the grayscale WriteableBitmap objects produced by the frame pump.

  5. Open MainPage.xaml.cs, the code-behind file for the main page, and then add the following directives at the top of the page.

    // Directives
    using Microsoft.Devices;
    using System.Windows.Media.Imaging;
    using System.Threading;
    
    
  6. In MainPage.xaml.cs, in the MainPage class, add the following variable declarations before the MainPage class constructor.

        // Variables
        PhotoCamera cam;
        private static ManualResetEvent pauseFramesEvent = new ManualResetEvent(true);
        private WriteableBitmap wb;
        private Thread ARGBFramesThread;
        private bool pumpARGBFrames;
    
    
  7. In MainPage.xaml.cs, add the following code to the MainPage class.

    NoteNote:

    Until you complete the remaining steps of this procedure, Visual Studio may list errors about methods that do not yet exist in the current context; those methods are added in later steps.

            //Code for camera initialization event, and setting the source for the viewfinder
            protected override void OnNavigatedTo(System.Windows.Navigation.NavigationEventArgs e)
            {
    
                // Check to see if the camera is available on the phone.
                if ((PhotoCamera.IsCameraTypeSupported(CameraType.Primary) == true) ||
                     (PhotoCamera.IsCameraTypeSupported(CameraType.FrontFacing) == true))
                {
                    // Initialize the default camera.
                    cam = new Microsoft.Devices.PhotoCamera();
    
                    //Event is fired when the PhotoCamera object has been initialized
                    cam.Initialized += new EventHandler<Microsoft.Devices.CameraOperationCompletedEventArgs>(cam_Initialized);
    
                    //Set the VideoBrush source to the camera
                    viewfinderBrush.SetSource(cam);
                }
                else
                {
                    // The camera is not supported on the phone.
                    this.Dispatcher.BeginInvoke(delegate()
                    {
                        // Write message.
                        txtDebug.Text = "A Camera is not available on this phone.";
                    });
    
                    // Disable UI.
                    GrayscaleOnButton.IsEnabled = false;
                    GrayscaleOffButton.IsEnabled = false;
                }
            }
    
    
    

    This code uses the OnNavigatedTo(NavigationEventArgs) method to create a PhotoCamera object named cam and add an event handler. This code also sets the VideoBrush source to the phone camera object, cam. If a camera is not available on the phone, the buttons are disabled and a message is displayed in the UI.

    NoteNote:

    To pull video frames from the camera, as shown later with the GetPreviewBufferArgb32(Int32[]) method, the PhotoCamera object needs to be set to display a video preview to a VideoBrush control.
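
    The sample always constructs the default (primary) camera. As a hypothetical variation, not part of the sample, the following sketch shows how you could prefer the primary camera and fall back to the front-facing camera when only that one is present.

        // Hypothetical variation: choose a specific camera type instead of the default.
        if (PhotoCamera.IsCameraTypeSupported(CameraType.Primary))
        {
            cam = new PhotoCamera(CameraType.Primary);
        }
        else if (PhotoCamera.IsCameraTypeSupported(CameraType.FrontFacing))
        {
            cam = new PhotoCamera(CameraType.FrontFacing);
        }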

  8. In MainPage.xaml.cs, add the following code to the MainPage class.

        protected override void OnNavigatingFrom(System.Windows.Navigation.NavigatingCancelEventArgs e)
        {
            if (cam != null)
            {
                // Dispose of the camera to minimize power consumption and to expedite shutdown.
                cam.Dispose();

                // Release memory, ensure garbage collection.
                cam.Initialized -= cam_Initialized;
            }
        }
    
    

    This code disposes of the camera and removes the Initialized event handler when the user navigates away from the page, which releases memory related to the camera.

  9. In MainPage.xaml.cs, add the following code to the MainPage class.

    //Update UI if initialization succeeds
            void cam_Initialized(object sender, Microsoft.Devices.CameraOperationCompletedEventArgs e)
            {        
                if (e.Succeeded)
                {
                    this.Dispatcher.BeginInvoke(delegate()
                    {
                        txtDebug.Text = "Camera initialized";
                    });
                   
                }
            }
    
    

    This code uses the camera Initialized event to update the TextBlock named txtDebug. The BeginInvoke method is required because the Initialized event is raised on a thread other than the UI thread, so the status update must be dispatched to the UI thread.
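
    If you also want to surface a failed initialization, the following sketch shows one way to do that. It is an illustrative variation on the handler above, not part of the sample.

        // Hypothetical variation: also report a failed initialization.
        void cam_Initialized(object sender, Microsoft.Devices.CameraOperationCompletedEventArgs e)
        {
            this.Dispatcher.BeginInvoke(delegate()
            {
                txtDebug.Text = e.Succeeded ? "Camera initialized"
                                            : "Camera initialization failed: " + e.Exception.Message;
            });
        }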

  10. To create a camera app, the camera capability must be declared in the app manifest file. Without it, the app will not function. Open WMAppManifest.xml and confirm that the following capabilities element is present.

    <Capability Name="ID_CAP_ISV_CAMERA"/>
    

    For more info about app capabilities and requirements, see App capabilities and hardware requirements for Windows Phone 8.

In this section, you create two methods that are collectively responsible for taking ARGB (Alpha, Red, Green, Blue) frames from the camera and converting them to grayscale.

To create the ARGB frame pump

  1. In MainPage.xaml.cs, add the following code to the MainPage class.

        // ARGB frame pump
        void PumpARGBFrames()
        {
            // Create capture buffer.
            int[] ARGBPx = new int[(int) cam.PreviewResolution.Width * (int) cam.PreviewResolution.Height];
    
            try
            {
                PhotoCamera phCam = (PhotoCamera)cam;
    
                while (pumpARGBFrames)
                {
                    pauseFramesEvent.WaitOne();
    
                    // Copies the current viewfinder frame into a buffer for further manipulation.
                    phCam.GetPreviewBufferArgb32(ARGBPx);
    
                    // Conversion to grayscale.
                    for (int i = 0; i < ARGBPx.Length; i++)
                    {
                        ARGBPx[i] = ColorToGray(ARGBPx[i]);
                    }
    
                    pauseFramesEvent.Reset();
                    Deployment.Current.Dispatcher.BeginInvoke(delegate()
                    {
                        // Copy to WriteableBitmap.
                        ARGBPx.CopyTo(wb.Pixels, 0);
                        wb.Invalidate();
    
                        pauseFramesEvent.Set();
                    });
                }
    
            }
            catch (Exception e)
            {
                this.Dispatcher.BeginInvoke(delegate()
                {
                    // Display error message.
                    txtDebug.Text = e.Message;
                });
            }
        }
    
        internal int ColorToGray(int color)
        {
            int gray = 0;
    
            int a = color >> 24;
            int r = (color & 0x00ff0000) >> 16;
            int g = (color & 0x0000ff00) >> 8;
            int b = (color & 0x000000ff);
    
            if ((r == g) && (g == b))
            {
                gray = color;
            }
            else
            {
                // Calculate for the illumination.
                // I =(int)(0.109375*R + 0.59375*G + 0.296875*B + 0.5)
                int i = (7 * r + 38 * g + 19 * b + 32) >> 6;
    
                gray = ((a & 0xFF) << 24) | ((i & 0xFF) << 16) | ((i & 0xFF) << 8) | (i & 0xFF);
            }
            return gray;
        }
    
    

    In this code, a method named PumpARGBFrames (the ARGB frame pump) copies each ARGB frame into a buffer for manipulation. While the frame is in the buffer, the ColorToGray method converts it to grayscale, pixel by pixel. Finally, PumpARGBFrames copies the converted frame to a WriteableBitmap object named wb. Because the pump processes frames continuously, the result is live grayscale video displayed in the UI.

    NoteNote:

    To pull video frames from the camera, as shown with the GetPreviewBufferArgb32(Int32[]) method, the PhotoCamera object needs to be set to display a video preview to a VideoBrush control.
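
    As a worked example of the luminance weighting in ColorToGray, the integer expression (7*R + 38*G + 19*B + 32) >> 6 approximates 0.109*R + 0.594*G + 0.297*B (plus 0.5 for rounding) using only integer math. The pixel value below is hypothetical, chosen only to show the arithmetic.

        // Hypothetical input pixel: A = 255, R = 32, G = 96, B = 160 (0xFF2060A0).
        int color = unchecked((int)0xFF2060A0);
        int i = (7 * 32 + 38 * 96 + 19 * 160 + 32) >> 6;   // (224 + 3648 + 3040 + 32) / 64 = 108
        // The result packs 108 (0x6C) into each color channel: 0xFF6C6C6C.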

  2. In MainPage.xaml.cs, add the following code to the MainPage class.

        // Start ARGB to grayscale pump.
        private void GrayOn_Clicked(object sender, RoutedEventArgs e)
        {
            MainImage.Visibility = Visibility.Visible;
            pumpARGBFrames = true;
            ARGBFramesThread = new System.Threading.Thread(PumpARGBFrames);
    
            wb = new WriteableBitmap((int) cam.PreviewResolution.Width, (int) cam.PreviewResolution.Height);
            this.MainImage.Source = wb;
    
            // Start pump.
            ARGBFramesThread.Start();
            this.Dispatcher.BeginInvoke(delegate()
            {
                txtDebug.Text = "ARGB to Grayscale";
            });
        }
    
        // Stop ARGB to grayscale pump.
        private void GrayOff_Clicked(object sender, RoutedEventArgs e)
        {
            MainImage.Visibility = Visibility.Collapsed;
            pumpARGBFrames = false;
    
            this.Dispatcher.BeginInvoke(delegate()
            {
                txtDebug.Text = "";
            });
        }
    
    

    In this code, the GrayOn_Clicked method makes the Image control named MainImage visible, creates a WriteableBitmap named wb at the camera preview resolution, and sets MainImage to display wb, which is updated by the ARGB frame pump. The GrayOff_Clicked method collapses the MainImage control and stops the frame pump.

    NoteNote:

    Notice that the Image control is collapsed when grayscale mode is off. Because this control overlays the standard color viewfinder, it is made visible only for grayscale viewing.
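
    Because the frame pump runs on its own thread and can block in pauseFramesEvent.WaitOne(), you may want to make sure it can exit cleanly when the page is left. The following sketch is an optional hardening step that is not part of the sample; the StopFramePump name is hypothetical, and the method could be called from GrayOff_Clicked or OnNavigatingFrom.

        // Optional (hypothetical) shutdown helper for the frame pump thread.
        private void StopFramePump()
        {
            pumpARGBFrames = false;          // ask the pump loop to exit
            pauseFramesEvent.Set();          // release the thread if it is waiting in WaitOne()
            if (ARGBFramesThread != null && ARGBFramesThread.IsAlive)
            {
                ARGBFramesThread.Join(100);  // give the thread a short time to finish
            }
        }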

  3. On a phone, run the app by selecting the Debug | Start Debugging menu command.
