Broadcasting IM Text Based on Speech Recognition in a UCMA Application: Creating the UCMA Application (Part 2 of 3)

Summary:   Combine speech recognition, Microsoft Lync 2010 Automation API, and Microsoft Unified Communications Managed API (UCMA) 3.0 to broadcast urgent text to clients and customers.

Applies to:   Microsoft Unified Communications Managed API (UCMA) 3.0 Core SDK | Microsoft Lync 2010 SDK | Microsoft Speech Platform SDK

Published:   November 2011 | Provided by:   John Clarkson, Microsoft

This article is the second in a three-part series of articles about how to use speech recognition to broadcast instant messaging text from a Microsoft Unified Communications Managed API (UCMA) 3.0 application.

This application uses a helper class to perform platform setup and to create a user endpoint. For more information, see %ProgramFiles%\Microsoft UCMA 3.0\SDK\Core\Sample Applications\QuickStarts\Common\UCMASampleHelper.cs.

To perform platform and endpoint setup

  1. Create a helper class object.

    private UCMASampleHelper _helper;
    _helper = new UCMASampleHelper();
    
  2. Initialize and start up the platform, and create a user endpoint.

    // Create and establish a user endpoint. The helper performs platform setup and startup.
    _userEndpoint = _helper.CreateEstablishedUserEndpoint("Broadcast User");
    
    // Register a delegate to be called when an incoming audio/video call arrives.
    _userEndpoint.RegisterForIncomingCall<AudioVideoCall>(AudioVideoCall_Received);
    

This application features use of the SpeechRecognitionEngine class, provided by the Microsoft Speech Platform SDK, and the SpeechRecognitionConnector and SpeechRecognitionStream classes, provided by the UCMA 3.0 Core SDK.

Figure 1. AudioVideo namespace and Recognition namespace classes (object model)

To set up speech recognition within the UCMA application

  1. Create a SpeechRecognitionConnector and attach an AudioVideoFlow object.

    SpeechRecognitionConnector speechRecognitionConnector = new SpeechRecognitionConnector();
    speechRecognitionConnector.AttachFlow(_audioVideoFlow);
    
  2. Start the connector.

    SpeechRecognitionStream stream = speechRecognitionConnector.Start();
    
  3. Create a SpeechRecognitionEngine object.

    SpeechRecognitionEngine speechRecognitionEngine = new SpeechRecognitionEngine();
    speechRecognitionEngine.SpeechRecognized += new EventHandler<SpeechRecognizedEventArgs>(SpeechRecognitionEngine_SpeechRecognized);
    
  4. Attach a grammar to the speech recognition engine.

    Grammar gr = new Grammar(@"C:\YourGrammar.grxml", "Main");
    speechRecognitionEngine.LoadGrammarAsync(gr);
    
  5. Attach the speech recognition engine to the audio stream.

    SpeechAudioFormatInfo speechAudioFormatInfo = new SpeechAudioFormatInfo(8000, AudioBitsPerSample.Sixteen, Microsoft.Speech.AudioFormat.AudioChannel.Mono);
    speechRecognitionEngine.SetInputToAudioStream(stream, speechAudioFormatInfo);
    
  6. Start continuous, asynchronous recognition.

    speechRecognitionEngine.RecognizeAsync(RecognizeMode.Multiple);
    

To support speech recognition, define a grammar. The grammar defines what the speech recognition engine recognizes as spoken input. This application uses an XML grammar file as input to the Grammar class that is provided by the Microsoft Speech Platform SDK. For more information, see the example that appears in Broadcasting IM Text Based on Speech Recognition in a UCMA Application: Code Listing (Part 3 of 3).
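The complete grammar used by this application appears in Part 3. As an illustration only, a minimal GRXML (SRGS 1.0) grammar with a root rule named Main — matching the rule name passed to the Grammar constructor in step 4 — might look like the following. The stock name, price words, and the trailing "send" keyword shown here are placeholder assumptions, not the actual grammar from the sample.

```xml
<?xml version="1.0" encoding="utf-8"?>
<grammar version="1.0" xml:lang="en-US" mode="voice" root="Main"
         xmlns="http://www.w3.org/2001/06/grammar">
  <!-- Root rule: operation, stock name, min/max keyword, price, then "send". -->
  <rule id="Main" scope="public">
    <one-of>
      <item>buy</item>
      <item>sell</item>
    </one-of>
    <item>Contoso</item>
    <one-of>
      <item>minimum</item>
      <item>maximum</item>
    </one-of>
    <one-of>
      <item>fifty</item>
      <item>sixty</item>
    </one-of>
    <item>send</item>
  </rule>
</grammar>
```

A grammar shaped this way produces recognized text such as "buy Contoso maximum fifty send", whose first four words line up with the parsing performed later in the ImMessageText method.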

When the caller speaks, the audio stream is sent to the UCMA 3.0 application, where it is processed by the SpeechRecognitionEngine instance. If the utterance matches the recognition engine’s grammar, the SpeechRecognized event is raised. The following example is the handler for the SpeechRecognized event. This handler was registered in step 3 in the previous procedure.

void SpeechRecognitionEngine_SpeechRecognized(object sender, SpeechRecognizedEventArgs e)
{
  RecognitionResult result = e.Result;
  if (result != null)
  {
    Console.WriteLine("Speech recognized: " + result.Text);

    imMessageContent = result.Text;
    if (result.Text.Contains("send"))
    {
      SendIM();       //Call the method to send IM messages.
      _waitForSendMethodCompleted.WaitOne();
      _waitForConnectorToStop.Set();
    }
  }
}

The event handler's e parameter (of type SpeechRecognizedEventArgs) exposes a RecognitionResult object through the expression e.Result. The Text property of that object returns the recognized string, which is split into an array of words that is used to build the IM text. For example, if the recognized text were "buy Contoso maximum fifty send", the first four words would supply the operation, the stock name, the minimum/maximum keyword, and the price.

string ImMessageText()
{
  // Split the recognition result into individual words.
  char[] delimiterCharacters = { ' ' };
  string[] words = imMessageContent.Split(delimiterCharacters);

  // Extract the array elements.
  string msgTXT = "Hi this is your broker Kate Berger with an urgent recommendation to ";
  string operation = words[0];
  string name = words[1];
  string minORmax = words[2];
  string price = words[3];

  if (operation == "buy")
  {
    msgTXT = msgTXT + "BUY " + name + " with a " + minORmax + " value of " + price + ".";
  }
  else
  {
    msgTXT = msgTXT + "SELL " + name + " with a " + minORmax + " value of " + price + ".";
  }

  return msgTXT;
}

This application uses the BeginStartConversation method from the Lync SDK to send the message. A generic List collection is used to store the URIs of message recipients. In this case there are only two recipients, but there is no explicit limit to the number of URIs that can be addressed.

To store the URIs of message recipients

  1. Create an instance of the generic List class.

    System.Collections.Generic.List<string> inviteeList = new System.Collections.Generic.List<string>();
    
  2. Use the ConfigurationManager class to obtain recipient URIs from the App.config file.

    inviteeList.Add(ConfigurationManager.AppSettings["UserURI"]);
    inviteeList.Add(ConfigurationManager.AppSettings["UserURI2"]);
    
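The two appSettings keys read in step 2 must be defined in the application's App.config file. A minimal configuration might look like the following; the SIP addresses are placeholders, to be replaced with the URIs of your own recipients.

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <appSettings>
    <!-- SIP URIs of the instant message recipients. -->
    <add key="UserURI" value="sip:user1@contoso.com" />
    <add key="UserURI2" value="sip:user2@contoso.com" />
  </appSettings>
</configuration>
```

To address more recipients, add further keys here and a matching inviteeList.Add call for each one.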

To send the instant message

  1. Get an Automation object.

    Automation automation;
    automation = LyncClient.GetAutomation();
    
  2. Specify message settings.

    System.Collections.Generic.Dictionary<AutomationModalitySettings, object> mSettings = new System.Collections.Generic.Dictionary<AutomationModalitySettings, object>();
    string messageText = ImMessageText();
    mSettings.Add(AutomationModalitySettings.FirstInstantMessage, messageText);
    mSettings.Add(AutomationModalitySettings.SendFirstInstantMessageImmediately, true);
    
  3. Broadcast the instant messages.

    IAsyncResult ar = automation.BeginStartConversation(AutomationModalities.InstantMessage, inviteeList, mSettings, null, null);
    cWindow = automation.EndStartConversation(ar);
    

John Clarkson is a programming writer with the Microsoft Lync product team.
