Real-time Filter Demo for Windows Phone 8

Real-time Filter Demo is an example app demonstrating the use of the Lumia Imaging SDK for real-time image effects. The effects are applied to the stream received from the camera and shown in the viewfinder. This app does not support capturing photos.

[Screenshots: the app's viewfinder with two different real-time effects applied]

Compatibility

  • Compatible with Windows Phone 8.
  • Tested with Nokia Lumia 920 and Nokia Lumia 520.
  • Developed with Visual Studio 2013 Express for Windows Phone 8.
  • Compiling the project requires Lumia Imaging SDK.

Design

The user interface (UI) of the app is deliberately simple: the filter is changed using the buttons in the application bar. The filter index and name are shown in the bottom-left corner, and the frame rate in the bottom-right corner. The application menu contains a menu item for displaying the about page with information about the example.

[Screenshots: the viewfinder with four different filters applied]

Architecture overview 

The example consists of three key classes. The main page is a typical phone application page implemented by a XAML file and its C# code-behind. The main page implements the application UI, including the MediaElement that displays the camera viewfinder with an effect applied. The MainPage class also owns the instances of the two other key classes: CameraStreamSource and Effects. CameraStreamSource, derived from MediaStreamSource, provides the camera data. Effects implements all the effects of the application.

Managing the camera stream

The camera stream is managed by the CameraStreamSource class. The GetSampleAsync(...) method uses Effects.GetNewFrameAndApplyEffect(...) to get the modified camera buffer.

// ...

public class CameraStreamSource : MediaStreamSource
{
    private readonly Dictionary<MediaSampleAttributeKeys, string> _emptyAttributes =
        new Dictionary<MediaSampleAttributeKeys, string>();

    private MediaStreamDescription _videoStreamDescription = null;
    private MemoryStream _frameStream = null;
    private ICameraEffect _cameraEffect = null;
    private long _currentTime = 0;
    private int _frameStreamOffset = 0;
    private int _frameTime = 0;
    private int _frameCount = 0;
    private Size _frameSize = new Size(0, 0);
    private int _frameBufferSize = 0;
    private byte[] _frameBuffer = null;

    // ...

    protected override void OpenMediaAsync()
    {
        // Member variables are initialized here

        // ...
    }

    protected override void GetSampleAsync(MediaStreamType mediaStreamType)
    {
        var task = _cameraEffect.GetNewFrameAndApplyEffect(_frameBuffer.AsBuffer(), _frameSize);
        
        // When the asynchronous call completes, report that the sample is ready

        task.ContinueWith((action) =>
        {
            _frameStream.Position = 0;
            _currentTime += _frameTime;
            _frameCount++;

            var sample = new MediaStreamSample(_videoStreamDescription, _frameStream, _frameStreamOffset,
                                               _frameBufferSize, _currentTime, _emptyAttributes);

            ReportGetSampleCompleted(sample);
        });
    }

    // ...
}
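
The listing above elides OpenMediaAsync(). A sketch of what such an initialization typically contains for a raw RGBA video stream is shown below; the concrete frame size, frame rate, and attribute values are illustrative assumptions, not verbatim from the example project.

    protected override void OpenMediaAsync()
    {
        // Allocate the frame buffer and wrap it in a reusable stream (sizes are assumptions)
        _frameSize = new Size(640, 480);
        _frameBufferSize = (int)_frameSize.Width * (int)_frameSize.Height * 4; // 4 bytes per pixel
        _frameBuffer = new byte[_frameBufferSize];
        _frameStream = new MemoryStream(_frameBuffer);
        _frameTime = (int)(TimeSpan.FromSeconds(1.0 / 30.0).Ticks); // target ~30 FPS

        // Describe the raw video stream to the media pipeline
        var streamAttributes = new Dictionary<MediaStreamAttributeKeys, string>
        {
            { MediaStreamAttributeKeys.VideoFourCC, "RGBA" },
            { MediaStreamAttributeKeys.Width, ((int)_frameSize.Width).ToString() },
            { MediaStreamAttributeKeys.Height, ((int)_frameSize.Height).ToString() }
        };

        _videoStreamDescription = new MediaStreamDescription(MediaStreamType.Video, streamAttributes);

        // A live camera stream cannot be seeked and has no fixed duration
        var sourceAttributes = new Dictionary<MediaSourceAttributesKeys, string>
        {
            { MediaSourceAttributesKeys.CanSeek, false.ToString() },
            { MediaSourceAttributesKeys.Duration, TimeSpan.FromSeconds(0).Ticks.ToString() }
        };

        ReportOpenMediaCompleted(sourceAttributes, new[] { _videoStreamDescription });
    }

Once ReportOpenMediaCompleted(...) has been called, the media pipeline starts requesting frames through GetSampleAsync(...).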

Linking the stream to the media element on the screen

Displaying the manipulated camera buffer on the screen is handled by the MainPage class. In the XAML declaration of the page, a VideoBrush is set as the LayoutRoot grid's background.

<Grid x:Name="LayoutRoot" Tap="LayoutRoot_Tap">
    <Grid.Background>
        <VideoBrush x:Name="BackgroundVideoBrush"/>
    </Grid.Background>

    <!-- ... -->

</Grid>

In the C# code, the custom CameraStreamSource is set as the source for MediaElement, and then MediaElement is set as a source for the VideoBrush:

// ...

public class MainPage : PhoneApplicationPage
{
    private MediaElement _mediaElement = null;
    private CameraStreamSource _cameraStreamSource = null;
    
    // ...

    private async void Initialize()
    {
        // Camera stream source is initialized here

        // ...

        _mediaElement = new MediaElement();
        _mediaElement.Stretch = Stretch.UniformToFill;
        _mediaElement.BufferingTime = new TimeSpan(0);
        _mediaElement.SetSource(_cameraStreamSource);

        BackgroundVideoBrush.SetSource(_mediaElement);

        // ...
    }

    // ...
}
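
The Design section mentions a frame-rate readout in the corner of the screen. One simple way to drive it is to poll the stream source's frame counter once per second. The sketch below assumes a public FrameCount property on CameraStreamSource and a TextBlock named FpsTextBlock in the page XAML; both are hypothetical, not part of the listings above.

    // Hypothetical sketch: poll the frame counter once per second and
    // display the difference as frames per second.
    private DispatcherTimer _fpsTimer = null;
    private int _lastFrameCount = 0;

    private void StartFpsCounter()
    {
        _fpsTimer = new DispatcherTimer { Interval = TimeSpan.FromSeconds(1) };

        _fpsTimer.Tick += (sender, args) =>
        {
            int current = _cameraStreamSource.FrameCount; // assumed public counter
            FpsTextBlock.Text = string.Format("{0} FPS", current - _lastFrameCount);
            _lastFrameCount = current;
        };

        _fpsTimer.Start();
    }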

Applying the effect

The effect is applied by the GetNewFrameAndApplyEffect(...) method in the Effects class.

First, the CameraPreviewImageSource is invalidated so that a fresh frame is loaded from the camera. Then a new Bitmap is created that wraps the frameBuffer argument. Finally, the active effect is rendered into that buffer with a BitmapRenderer.

public class Effects
{
    private PhotoCaptureDevice _photoCaptureDevice = null;
    private CameraPreviewImageSource _cameraPreviewImageSource = null;
    private FilterEffect _filterEffect = null;
    private CustomEffectBase _customEffect = null;
    private Semaphore _semaphore = new Semaphore(1, 1);

    // ...

    public PhotoCaptureDevice CaptureDevice
    {
        set
        {
            if (_photoCaptureDevice != value)
            {
                while (!_semaphore.WaitOne(100));

                _photoCaptureDevice = value;

                Initialize();

                _semaphore.Release();
            }
        }
    }

    // ...

    public async Task GetNewFrameAndApplyEffect(IBuffer frameBuffer, Size frameSize)
    {
        if (_semaphore.WaitOne(500))
        {
            _cameraPreviewImageSource.InvalidateLoad(); // Invalidate camera frame

            var scanlineByteSize = (uint)frameSize.Width * 4; // 4 bytes per pixel in Bgra8888 mode
            var bitmap = new Bitmap(frameSize, ColorMode.Bgra8888, scanlineByteSize, frameBuffer);

            if (_filterEffect != null)
            {
                var renderer = new BitmapRenderer(_filterEffect, bitmap);
                await renderer.RenderAsync();
            }
            else if (_customEffect != null)
            {
                var renderer = new BitmapRenderer(_customEffect, bitmap);
                await renderer.RenderAsync();
            }

            // ...

            _semaphore.Release();
        }
    }

    private void Initialize()
    {
        _cameraPreviewImageSource = new CameraPreviewImageSource(_photoCaptureDevice);
      
        var filters = new List<IFilter>();

        // ...

        switch (_effectIndex)
        {
            case 0:
                {
                    filters.Add(new LomoFilter(0.5, 0.5, LomoVignetting.High, LomoStyle.Yellow));
                }
                break;

            // ...

            case 10:
                {
                    _customEffect = new CustomEffect(_cameraPreviewImageSource);
                }
                break;
        }

        if (filters.Count > 0)
        {
            _filterEffect = new FilterEffect(_cameraPreviewImageSource)
            {
                Filters = filters
            };
        }
    }

    // ...
}
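
The Design section notes that the filter is changed with the application bar buttons. A minimal handler could simply advance an index on the Effects instance and let the setter rebuild the effect chain; the EffectIndex and EffectCount members below are hypothetical, not part of the listing above.

    // Hypothetical sketch of cycling to the next effect from an
    // application bar button; assumes Effects exposes an EffectIndex
    // property (whose setter re-runs Initialize()) and an EffectCount.
    private void NextEffectButton_Click(object sender, EventArgs e)
    {
        _effects.EffectIndex = (_effects.EffectIndex + 1) % _effects.EffectCount;
    }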

Implementing custom effects

It is also easy to implement custom effects and use them in place of the FilterEffect that comes with the Lumia Imaging SDK. Note that custom effects are not IFilter instances used with a FilterEffect; instead, they derive from CustomEffectBase, which sits at the same level of the class hierarchy as FilterEffect and is used in the same way in the rendering workflow.

Here's a simple custom effect that has a weighted inverse grayscale effect on the image.

public class CustomEffect : CustomEffectBase
{
    // ...

    public CustomEffect(IImageProvider source) : base(source)
    {
    }

    protected override void OnProcess(PixelRegion sourcePixelRegion, PixelRegion targetPixelRegion)
    {
        var sourcePixels = sourcePixelRegion.ImagePixels;
        var targetPixels = targetPixelRegion.ImagePixels;

        sourcePixelRegion.ForEachRow((index, width, position) =>
        {
            for (int x = 0; x < width; ++x, ++index)
            {
                // The only supported color mode is ColorMode.Bgra8888

                uint pixel = sourcePixels[index];
                uint blue = pixel & 0x000000ff; // blue color component
                uint green = (pixel & 0x0000ff00) >> 8; // green color component
                uint red = (pixel & 0x00ff0000) >> 16; // red color component
                uint average = (uint)(0.0722 * blue + 0.7152 * green + 0.2126 * red); // weighted average component
                uint grayscale = 0xff000000 | average | (average << 8) | (average << 16); // use average for each color component

                targetPixels[index] = ~grayscale; // use inverse grayscale
            }
        });
    }
}

See the CustomEffectBase section under the Core concepts page for another example and more details.

Downloads

Real-time Filter Demo source code: real-time-filter-demo-master.zip

This example application is hosted in GitHub, where you can check the latest activities, report issues, browse source, ask questions, or even contribute to the project yourself.
