Robotics Tutorial 7 (C#) - Speech and Vision in Robots

The Speech and Vision sample shows how to use speech recognition to drive a robot, and how to use the Blob Tracker with a camera mounted on the robot to implement a "follow me" service.

This tutorial is provided in the C# language. You can find the project files for this tutorial at the following location under the Microsoft Robotics Developer Studio installation folder:

Samples\RoboticsTutorials\Tutorial7\CSharp

This tutorial teaches you how to:

  • Create Partners.
  • Create Event Handlers.
  • Using Manifest and Config Files.
  • Start and Run the Tutorial.

See Also:

  • Getting Started

Prerequisites

Hardware

This tutorial is designed for use with an iRobot Create with a mounted camera. A microphone is also required to enable speech recognition. An Xbox controller, or similar, is required if you want to drive the robot with controller commands in addition to voice commands.

Software

This tutorial is designed for use with Microsoft Visual C#. You can use:

  • Microsoft Visual C# Express Edition
  • Microsoft Visual Studio Standard, Professional, or Team Edition.

You will also need Microsoft Internet Explorer or another conventional web browser and the .NET framework version 3.0 or later for speech recognition.

Getting Started

The service for this tutorial already exists and is available under the setup install directory in the samples\RoboticsTutorials\Tutorial7\CSharp\ folder.

Begin by loading RoboticsTutorial7.csproj in Microsoft Visual Studio.

After you have built the project, you can run it from the DSS Command Prompt:

dsshost /port:50000 /manifest:"samples\config\RoboticsTutorial7.manifest.xml"

Note: This manifest loads services specific to the iRobot Create. If you want to use a different platform, you must change the manifest. Alternatively, you can start services manually from the Control Panel.

You can also set up Microsoft Visual Studio to start the sample from within the IDE:

  1. In the Solution Explorer, double-click the project's Properties node.
  2. In the Debug tab, choose "Start external program" and specify dsshost.exe.
    You can find this file in the bin subdirectory of the setup install directory. For example, c:\Microsoft RDS\bin\dsshost.exe.
  3. Use /port:50000 /manifest:"samples\config\RoboticsTutorial7.manifest.xml" as the command line arguments and the setup installation folder as the working directory.

You can now run the sample by pressing F5 or selecting the Debug > Start Debugging menu command.

Step 1: Create Partners

The Robotics Tutorial 7 service needs four partners:

  • SpeechRecognizer
  • Drive
  • XInputGamePad
  • BlobTracker

The service will use the Speech Recognizer to recognize speech commands, and the Drive service to drive the robot. The XInputGamePad is required to drive the robot using an input controller. The BlobTracker is required to enable the "follow me" functionality.

The code declaring these partnerships is as follows:

[Partner("SpeechRecognizer", Contract = sr.Contract.Identifier, CreationPolicy = PartnerCreationPolicy.UseExisting)]
sr.SpeechRecognizerOperations _srPort = new sr.SpeechRecognizerOperations();
sr.SpeechRecognizerOperations _srNotify = new sr.SpeechRecognizerOperations();

[Partner("Drive", Contract = drive.Contract.Identifier, CreationPolicy = PartnerCreationPolicy.UseExisting)]
drive.DriveOperations _drivePort = new drive.DriveOperations();

[Partner("XInputGamePad", Contract = xinput.Contract.Identifier, CreationPolicy = PartnerCreationPolicy.CreateAlways)]
xinput.XInputGamepadOperations _xinputPort = new xinput.XInputGamepadOperations();
xinput.XInputGamepadOperations _xinputNotify = new xinput.XInputGamepadOperations();

[Partner("BlobTracker", Contract = blob.Contract.Identifier, CreationPolicy = PartnerCreationPolicy.UseExisting)]
blob.BlobTrackerOperations _blobPort = new blob.BlobTrackerOperations();
blob.BlobTrackerOperations _blobNotify = new blob.BlobTrackerOperations();
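
These partner fields, together with the _followMe flag that the handlers in Step 2 toggle, are declared as members of the service class. The following is a minimal sketch of that surrounding class; the class name, attributes, and using directives are assumptions for illustration, not the tutorial's exact code:

using Microsoft.Ccr.Core;
using Microsoft.Dss.Core.Attributes;
using Microsoft.Dss.ServiceModel.Dssp;
using Microsoft.Dss.ServiceModel.DsspServiceBase;

[Contract(Contract.Identifier)]
[DisplayName("RoboticsTutorial7")]
public class RoboticsTutorial7 : DsspServiceBase
{
    // Toggled by the speech and gamepad handlers in Step 2 to switch
    // the "follow me" behavior on and off.
    bool _followMe;

    // The partner declarations shown above go here, as private fields.

    // Standard DSS service constructor.
    public RoboticsTutorial7(DsspServiceCreationPort creationPort)
        : base(creationPort)
    {
    }
}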

Step 2: Create Event Handlers

To manage the event notifications, subscribe to the necessary services and create a handler method for each notification received.

The code inside the Start() service method follows:

_srPort.Subscribe(_srNotify);
_xinputPort.Subscribe(_xinputNotify);
_blobPort.Subscribe(_blobNotify);

Activate<ITask>(
    Arbiter.Receive<sr.SpeechRecognized>(true, _srNotify, OnSRRecognition),
    Arbiter.Receive<xinput.ButtonsChanged>(true, _xinputNotify, OnButtonsChanged),
    Arbiter.Receive<blob.ImageProcessed>(true, _blobNotify, OnImageProcessed));
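
For context, this code typically sits in the service's overridden Start() method, after the base class has started. The sketch below assumes the standard DsspServiceBase pattern; the subscription and activation lines are the ones shown above:

protected override void Start()
{
    // Publish the service in the directory and activate the handlers
    // declared on the main operations port.
    base.Start();

    // Subscribe to partner notifications and register persistent receivers.
    _srPort.Subscribe(_srNotify);
    _xinputPort.Subscribe(_xinputNotify);
    _blobPort.Subscribe(_blobNotify);

    Activate<ITask>(
        Arbiter.Receive<sr.SpeechRecognized>(true, _srNotify, OnSRRecognition),
        Arbiter.Receive<xinput.ButtonsChanged>(true, _xinputNotify, OnButtonsChanged),
        Arbiter.Receive<blob.ImageProcessed>(true, _blobNotify, OnImageProcessed));
}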

Each of these notifications has a corresponding handler method.

The OnSRRecognition method handles the notification sent when one of the configured speech commands is recognized. The code is as follows:

void OnSRRecognition(sr.SpeechRecognized recognition)
{
    if (recognition.Body.Semantics != null)
    {
        switch (recognition.Body.Semantics.ValueString)
        {
            case "Forward":
                _drivePort.SetDrivePower(0.5, 0.5);
                break;
            case "Backward":
                _drivePort.SetDrivePower(-0.5, -0.5);
                break;
            case "Left":
                _drivePort.SetDrivePower(-0.3, 0.3);
                break;
            case "Right":
                _drivePort.SetDrivePower(0.3, -0.3);
                break;
            case "Stop":
                _followMe = false;
                _drivePort.SetDrivePower(0.0, 0.0);
                break;
            case "FollowMe":
                _followMe = !_followMe;
                if (!_followMe)
                {
                    _drivePort.SetDrivePower(0.0, 0.0);
                }
                break;
        }
    }
}

The OnButtonsChanged method follows the same pattern as OnSRRecognition, but responds to button presses on the controller; a sketch of one possible implementation is shown below.
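
That handler is not reproduced in the tutorial text. The following sketch shows one possible implementation; the button-to-command mapping and the boolean button properties (A, B, X, Y) on the notification body are assumptions for illustration:

void OnButtonsChanged(xinput.ButtonsChanged buttonsChanged)
{
    // Assumed mapping: Y = forward, A = backward, X = spin left, B = spin right.
    if (buttonsChanged.Body.Y)
    {
        _drivePort.SetDrivePower(0.5, 0.5);
    }
    else if (buttonsChanged.Body.A)
    {
        _drivePort.SetDrivePower(-0.5, -0.5);
    }
    else if (buttonsChanged.Body.X)
    {
        _drivePort.SetDrivePower(-0.3, 0.3);
    }
    else if (buttonsChanged.Body.B)
    {
        _drivePort.SetDrivePower(0.3, -0.3);
    }
    else
    {
        // No direction button pressed: cancel "follow me" and stop the robot.
        _followMe = false;
        _drivePort.SetDrivePower(0.0, 0.0);
    }
}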

The OnImageProcessed method handles the notification sent when the Blob Tracker has processed an image; it implements the "follow me" functionality. If the Blob Tracker reports a tracked area larger than a given threshold, the robot drives toward the object; otherwise the robot turns in place, searching for something to track. The code follows:

void OnImageProcessed(blob.ImageProcessed imageProcessed)
{
    if (_followMe)
    {
        if (imageProcessed.Body.Results.Count == 1)
        {
            if (imageProcessed.Body.Results[0].Area > 100) //object detected
            {
                _drivePort.SetDrivePower(0.5, 0.5);
            }
            else //search object
            {
                _drivePort.SetDrivePower(-0.3, 0.3);
            }
        }
    }
}

Step 3: Using Manifest and Config Files

Now that the service is created, you must write the manifest and the config files that set the initial state of each partner service.

The manifest must refer to the partners that are declared with the following creation policy:

CreationPolicy = PartnerCreationPolicy.UseExisting

These partners are referenced in the manifest as follows.

For the Speech Recognizer:

<ServiceRecordType>
  <dssp:Contract>https://schemas.microsoft.com/robotics/2008/02/speechrecognizer.html</dssp:Contract>
  <dssp:PartnerList>
    <dssp:Partner>
      <dssp:Contract>https://schemas.microsoft.com/robotics/2008/02/speechrecognizer.html</dssp:Contract>
      <dssp:Service>RoboticsTutorial7SpeechRecognizer.Config.xml</dssp:Service>
      <dssp:PartnerList />
      <dssp:Name>dssp:StateService</dssp:Name>
    </dssp:Partner>
  </dssp:PartnerList>
  <Name>this:speechrecognizer</Name>
</ServiceRecordType>

For the iRobot Create:

<ServiceRecordType>
  <dssp:Contract>https://schemas.microsoft.com/robotics/2007/01/irobot.html</dssp:Contract>
  <dssp:PartnerList>
    <dssp:Partner>
      <dssp:Contract>https://schemas.microsoft.com/robotics/2007/01/irobot.html</dssp:Contract>
      <dssp:Service>RoboticsTutorial7iRobotCreateRoombaHardware.Config.xml</dssp:Service>
      <dssp:PartnerList />
      <dssp:Name>dssp:StateService</dssp:Name>
    </dssp:Partner>
    <dssp:Partner>
      <dssp:Contract>https://schemas.microsoft.com/robotics/generic/2006/12/dssstream.html</dssp:Contract>
      <dssp:PartnerList />
      <dssp:Name>irobot:irobotstream</dssp:Name>
      <dssp:ServiceName>this:iRobotInternalCommunications</dssp:ServiceName>
    </dssp:Partner>
  </dssp:PartnerList>
  <Name>this:irobot</Name>
</ServiceRecordType>
<ServiceRecordType>
  <dssp:Contract>https://schemas.microsoft.com/robotics/2006/12/irobot/drive.html</dssp:Contract>
  <dssp:PartnerList>
    <dssp:Partner>
      <dssp:Contract>https://schemas.microsoft.com/robotics/2007/02/irobotlite.html</dssp:Contract>
      <dssp:PartnerList />
      <dssp:Name>drive:iRobotUpdates</dssp:Name>
      <dssp:ServiceName>this:irobot</dssp:ServiceName>
    </dssp:Partner>
    <dssp:Partner>
      <dssp:Contract>https://schemas.microsoft.com/robotics/2007/01/irobot/create.html</dssp:Contract>
      <dssp:PartnerList />
      <dssp:Name>drive:Create</dssp:Name>
      <dssp:ServiceName>this:irobot</dssp:ServiceName>
    </dssp:Partner>
  </dssp:PartnerList>
  <Name>this:drive</Name>
</ServiceRecordType>

Finally for the Blob Tracker:

<ServiceRecordType>
  <dssp:Contract>https://schemas.microsoft.com/robotics/2007/03/blobtracker.html</dssp:Contract>
  <dssp:PartnerList>
    <dssp:Partner>
      <dssp:Contract>https://schemas.microsoft.com/robotics/2007/03/blobtracker.html</dssp:Contract>
      <dssp:Service>RoboticsTutorial7BlobTracker.Config.xml</dssp:Service>
      <dssp:PartnerList />
      <dssp:Name>dssp:StateService</dssp:Name>
    </dssp:Partner>
    <dssp:Partner>
      <dssp:Contract>https://schemas.microsoft.com/robotics/2006/05/webcamservice.html</dssp:Contract>
      <dssp:PartnerList />
      <dssp:Name>blobtracker:WebCam</dssp:Name>
      <dssp:ServiceName>this:WebCam</dssp:ServiceName>
    </dssp:Partner>
  </dssp:PartnerList>
  <Name>this:blobtracker</Name>
</ServiceRecordType>
<ServiceRecordType>
  <dssp:Contract>https://schemas.microsoft.com/robotics/2006/05/multidevicewebcamservice.html</dssp:Contract>
  <dssp:PartnerList />
  <Name>this:WebCam</Name>
</ServiceRecordType>

In order to better understand the manifest, you can open it using the Decentralized Software Services (DSS) Manifest Editor.

A config file must also be provided for each of these services. These config files can be edited using the DSS Manifest Editor; for this project, they should look like the following.

For the SpeechRecognizer, you must provide the commands that the tutorial uses. The semantic values are: Left, Right, Forward, Backward, Stop, and FollowMe. To predefine these commands, the config file looks like this:

<?xml version="1.0" encoding="utf-8"?>
<SpeechRecognizerState xmlns="https://schemas.microsoft.com/robotics/2008/02/speechrecognizer.html">
  <DictionaryGrammar>
    <Elem>
      <string>Backward</string>
      <string>Backward</string>
    </Elem>
    <Elem>
      <string>Follow me</string>
      <string>FollowMe</string>
    </Elem>
    <Elem>
      <string>Forward</string>
      <string>Forward</string>
    </Elem>
    <Elem>
      <string>Left</string>
      <string>Left</string>
    </Elem>
    <Elem>
      <string>Right</string>
      <string>Right</string>
    </Elem>
    <Elem>
      <string>Stop moving</string>
      <string>Stop</string>
    </Elem>
  </DictionaryGrammar>
  <IgnoreAudioInput>false</IgnoreAudioInput>
  <GrammarType>DictionaryStyle</GrammarType>
</SpeechRecognizerState>

Another way to create these commands is to use the web interface of the SpeechRecognizerGui service, where you can add the text you want recognized and assign it a semantic value matching one of those listed above.

For the iRobot Create, the config file determines which port and type of connection is used for communication. If a config file is not provided, a browser window pops up where you can configure the connection.

If a config file is provided and you are connecting through serial port 3, the XML looks like this:

<RoombaState xmlns:s="http://www.w3.org/2003/05/soap-envelope" xmlns:wsa="https://schemas.xmlsoap.org/ws/2004/08/addressing" xmlns:d="https://schemas.microsoft.com/xw/2004/10/dssp.html" xmlns="https://schemas.microsoft.com/robotics/2007/01/irobot.html">
  <BaudRate>57600</BaudRate>
  <Name>Sunny</Name>
  <FirmwareDate>0001-01-01T00:00:00</FirmwareDate>
  <SerialPort>3</SerialPort>
  <IRobotModel>Create</IRobotModel>
  <ConnectionType>CreateSerialPort</ConnectionType>
  <Mode>Uninitialized</Mode>
  <MaintainMode>Off</MaintainMode>
<...>
</RoombaState>

Finally, the BlobTracker config file must specify the color to track. If the config file does not exist, open the manifest in the DSS Manifest Editor and select the BlobTracker; its config file appears in red, indicating that it is missing. Delete that entry, then choose "Create a New Initial State" for the BlobTracker, where you can add a Color Bin for tracking and set the required attributes.

A config file for a pink/red color tracking will look like this:

<?xml version="1.0" encoding="utf-8"?>
<BlobTrackerState xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="https://schemas.microsoft.com/robotics/2007/03/blobtracker.html">
  <ColorBins>
    <ColorBin>
      <Name>red</Name>
      <RedMin>50</RedMin>
      <RedMax>256</RedMax>
      <GreenMin>0</GreenMin>
      <GreenMax>10</GreenMax>
      <BlueMin>0</BlueMin>
      <BlueMax>10</BlueMax>
    </ColorBin>
  </ColorBins>
  <TimeStamp>2007-04-04T11:04:49.2070614-07:00</TimeStamp>
</BlobTrackerState>

Step 4: Start and Run the Tutorial

Start the DSS Command Prompt from the Start > All Programs menu.

Start a DssHost node and create an instance of the service by typing the following command:

dsshost /port:50000 /t:50001 /manifest:"samples\config\RoboticsTutorial7.manifest.xml"

Make sure all configuration files are in the same folder as the manifest; otherwise they will not be found and the service will not work correctly.

This starts the service. You can now drive the robot using speech or the XInput controller. If you say "Follow me", the robot tracks the color specified in the Blob Tracker config file; saying "Stop" or "Follow me" again stops the "follow me" behavior.

Summary

In this tutorial, you learned how to:

  • Create Partners.
  • Create Event Handlers.
  • Using Manifest and Config Files.
  • Start and Run the Tutorial.

© 2012 Microsoft Corporation. All Rights Reserved.