

Create the Detection Application

This content is no longer actively maintained. It is provided as is, for anyone who may still be using these technologies, with no warranties or claims of accuracy with regard to the most recent product version or service release.

After you create the grammar that the answering machine detection application uses, the next step is to lay out the dialog flow and add code for the various handlers.

Creating the Dialog Flow

Because an answering machine detection application uses outbound calling, you need to update the default voice response application template.

To set up the dialog flow

  1. Drag a Code activity onto the design surface, drop it above the AnswerCall activity, and then change the name to startUp.

  2. Right-click the AnswerCall activity, and then click Delete.

  3. Drag a MakeCall activity onto the design surface, drop it below the Code activity, and then change the name to makeCall.

  4. Drag a DetectAnsweringMachine activity onto the design surface, drop it below the MakeCall activity, and then change the name to detectAnsweringMachine.

  5. Drag an IfElse activity onto the design surface, drop it below the DetectAnsweringMachine activity, and then change the name to distinguishCalledParty.

  6. Right-click one of the IfElseBranch activities inside the IfElse activity, and then click Copy.

  7. Right-click the IfElse activity, and then click Paste.

    There should be three IfElseBranch activities inside the IfElse activity.

  8. Change the names of the three IfElseBranch activities to ifAnswerMachine, ifHuman, and ifNeither, reading from left to right.

  9. Drag a Statement activity into each of the three IfElseBranch activities, and then rename the Statement activities to detectedAnswerMachine, detectedHuman, and detectedNothing, reading from left to right.

    Note

    A real-world application probably needs to do more than merely play a Statement activity.

At this point, the dialog flow should appear similar to that in the following illustration.

(Illustration of the completed dialog flow.)

Modifying Outbound.aspx to Pass the Outbound Number

A small change must be made in Outbound.aspx so that it can pass the outbound number to the application.

To modify Outbound.aspx to pass the outbound number

  1. In Solution Explorer, double-click Outbound.aspx.

  2. In the TriggerButton_Click method, edit the definition of the outboundUri variable so that OutboundPhoneNumber is set to the SIP address of the target phone.

    The SIP address appears in one of two forms: an IP address or a fully qualified domain name.

    An example using an IP address is OutboundPhoneNumber=sip:123@255.255.1.1.

    An example using a fully qualified domain name is OutboundPhoneNumber=sip:user@user2.contosa.com.
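Because the SIP address travels inside a URL query string, characters such as the colon and at sign must be escaped before the address is appended to outboundUri; the workflow's OnEntry code reverses this with Uri.UnescapeDataString. The following sketch shows the round trip using only the .NET base class library. The application URL and page name here are illustrative placeholders, not values taken from the generated Outbound.aspx.

```csharp
using System;

class OutboundUriSketch
{
    static void Main()
    {
        // Hypothetical target phone; either SIP address form works the same way.
        string sipAddress = "sip:123@255.255.1.1";

        // Escape the address so ':' and '@' survive the query string.
        string escaped = Uri.EscapeDataString(sipAddress);

        // Hypothetical application URL; substitute your deployment's address.
        string outboundUri =
            "http://localhost/DetectAnsMach/Default.aspx?OutboundPhoneNumber=" + escaped;
        Console.WriteLine(outboundUri);

        // The workflow recovers the original address on receipt,
        // mirroring the Uri.UnescapeDataString call in OnEntry.
        string recovered = Uri.UnescapeDataString(escaped);
        Console.WriteLine(recovered);
    }
}
```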

Add Backing Code for Handlers and Code Conditions

Now that the dialog flow is laid out and Outbound.aspx is ready to pass the outbound phone number, define variables and add initialization code for the Code activity, define code conditions for the IfElseBranch activities, and add handlers for the TurnStarting event for the DetectAnsweringMachine and Statement activities.

Variables

_grxmlPath holds the location of the grammar file and _grammarRule holds the name of the primary rule in this grammar. If your grammar file is at a different location, ensure that _grxmlPath contains the correct location. _outboundNumber holds the outbound number to be called. The last three variables hold the recognition result (the text of what the person or answering machine spoke), the detection result (a value that indicates that the call was to a person or an answering machine), and the message result (a value that indicates whether an attempt was made to leave a message, and if so, whether the attempt succeeded).

A good location for these variable definitions is in VoiceResponseWorkflow1.cs, right at the beginning of the body of the Workflow1 class and just above the definition of the Workflow1 constructor.

    private string _grxmlPath = @"D:\Program Files\Microsoft Office Communications Server 2007 Speech Server\SDK\Speech Projects\DetectAnsMach\DetectAnsMach\Grammars\Responses.grxml";
    private string _grammarRule = "Rule1";
    private string _outboundNumber = string.Empty;
    private RecognitionResult _recoResult = null;
    private DetectionResult _detectionResult = DetectionResult.None;
    private MessageResult _messageResult = MessageResult.None;

Two more items must be addressed: setting the detection grammar on detectAnsweringMachine, and registering a handler for the Closed event on this activity. A good location for these statements is in VoiceResponseWorkflow1.cs, in the definition of the Workflow1 constructor, right after the call to InitializeComponent().

    this.detectAnsweringMachine.Grammar = new Grammar(new Uri(_grxmlPath), _grammarRule);
    this.detectAnsweringMachine.Closed += new System.EventHandler<System.Workflow.ComponentModel.ActivityExecutionStatusChangedEventArgs>(this.detectAnsweringMachine_Closed);

Initialization Code

The following code initializes properties for makeCall, in preparation for answering machine detection. This code checks that the query string sent from Outbound.aspx is not null and contains a key named OutboundPhoneNumber that itself is not null. If these conditions are met, the code initializes the CalledParty and CallingParty properties on makeCall.

To add the initialization code

  1. On the design surface, right-click startUp, and then click Generate Handlers.

    This causes VoiceResponseWorkflow1.cs to open at the OnEntry method.

  2. Add the following code in the body of OnEntry.

          if (QueryString == null || QueryString["OutboundPhoneNumber"] == null)
          {
            TelephonySession.LoggingManager.LogApplicationError(100, "Survey application - OutboundPhoneNumber parameter not specified.");
            throw new Exception("Survey application - OutboundPhoneNumber parameter not specified.");
          }
          this._outboundNumber = Uri.UnescapeDataString(QueryString["OutboundPhoneNumber"]);
          SipUri calledPartyUri = new SipUri(this._outboundNumber);
          SipUri callingPartyUri = new SipUri("sip:user@user1.contosa.com");
          this.makeCall.CalledParty = new SipUriTelephonyAddress(calledPartyUri);
          this.makeCall.CallingParty = new SipUriTelephonyAddress(callingPartyUri);
    

Code Conditions

The next step is to set up the logic that controls which of the three IfElseBranch activity branches is used, based on the detection result from detectAnsweringMachine.

To set the code condition for ifAnswerMachine

  1. Select the left-most IfElseBranch activity, ifAnswerMachine.

  2. In the Properties window for ifAnswerMachine, click Condition, and then click Declarative Rule Condition in the Value box.

  3. Expand the Condition property.

  4. In the ConditionName box, enter AnsMachine.

  5. In Expression, click the browse (...) button to the right of Condition Expression.

  6. In the Rule Condition Editor box, enter the following text.

    this._detectionResult == Microsoft.SpeechServer.Dialog.DetectionResult.AnsweringMachine
    

Setting the code condition for ifHuman is almost identical to that in the previous procedure.

To set the code condition for ifHuman

  1. Select the center IfElseBranch activity, ifHuman.

  2. In the Properties window for ifHuman, click Condition, and then click Declarative Rule Condition in the Value box.

  3. Expand the Condition property.

  4. In the ConditionName box, enter Human.

  5. In Expression, click the browse (...) button to the right of Condition Expression.

  6. In the Rule Condition Editor box, enter the following text.

    this._detectionResult == Microsoft.SpeechServer.Dialog.DetectionResult.LivePerson 
    

The right-most IfElseBranch activity, ifNeither, is used when neither of the preceding branches matches. If everything is working correctly, this branch never runs, but it guards against a spurious detection result being routed to either of the first two branches.
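Taken together, the three branches behave like a three-way selection on the detection result, evaluated left to right with the first matching condition winning. The following stand-alone sketch mimics that logic in plain C#; the local DetectionResult enum reproduces only the Microsoft.SpeechServer.Dialog members referenced by the rule conditions above, and in the real application the conditions are evaluated by the workflow rules engine, not by application code.

```csharp
using System;

// Stand-in for Microsoft.SpeechServer.Dialog.DetectionResult; only the
// members used by the rule conditions above are reproduced here.
enum DetectionResult { None, AnsweringMachine, LivePerson }

class BranchSketch
{
    // Mirrors the left-to-right evaluation of the IfElse activity:
    // the first branch whose condition is true runs; ifNeither has
    // no condition and catches everything else.
    static string SelectBranch(DetectionResult detectionResult)
    {
        if (detectionResult == DetectionResult.AnsweringMachine)
            return "ifAnswerMachine";   // plays detectedAnswerMachine
        if (detectionResult == DetectionResult.LivePerson)
            return "ifHuman";           // plays detectedHuman
        return "ifNeither";             // plays detectedNothing
    }

    static void Main()
    {
        // prints: ifAnswerMachine, ifHuman, ifNeither
        Console.WriteLine(SelectBranch(DetectionResult.AnsweringMachine));
        Console.WriteLine(SelectBranch(DetectionResult.LivePerson));
        Console.WriteLine(SelectBranch(DetectionResult.None));
    }
}
```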

TurnStarting Event Handlers

The next step is to add TurnStarting event handlers for detectAnsweringMachine and the three Statement activities, and a Closed event handler for detectAnsweringMachine.

In the following procedure, the Message property on DetectAnsweringMachineActivity is set. If this activity determines that it has reached an answering machine, the activity plays the contents of Message. The message does not play if the called party is determined to be a human.

To complete the TurnStarting handler for detectAnsweringMachine

  1. On the design surface, select detectAnsweringMachine, the DetectAnsweringMachine activity.

  2. Right-click detectAnsweringMachine, and then click Generate Handlers.

    This causes VoiceResponseWorkflow1.cs to open at the detectAnsweringMachine_TurnStarting method.

  3. Add the following text to the body of this method.

    this.detectAnsweringMachine.Message.SetText("Sorry we missed you.");
    

The following procedure sets the text for the main prompt in detectedAnswerMachine, the Statement activity that executes if an answering machine is reached.

To complete the TurnStarting handler for detectedAnswerMachine

  1. On the design surface, select detectedAnswerMachine, the Statement activity in the ifAnswerMachine branch.

  2. Right-click detectedAnswerMachine, and then click Generate Handlers.

    This causes VoiceResponseWorkflow1.cs to open at the detectedAnswerMachine_TurnStarting method.

  3. Add the following text to the body of this method.

    this.detectedAnswerMachine.MainPrompt.SetText("Answering machine detected");
    

The procedures for completing the TurnStarting handlers for detectedHuman and detectedNothing are nearly identical.

To complete the TurnStarting handler for detectedHuman

  1. On the design surface, select detectedHuman, the Statement activity in the ifHuman branch.

  2. Right-click detectedHuman, and then click Generate Handlers.

    This causes VoiceResponseWorkflow1.cs to open at the detectedHuman_TurnStarting method.

  3. Add the following text to the body of this method.

    this.detectedHuman.MainPrompt.SetText("Human detected");
    

To complete the TurnStarting handler for detectedNothing

  1. On the design surface, select detectedNothing, the Statement activity in the ifNeither branch.

  2. Right-click detectedNothing, and then click Generate Handlers.

    This causes VoiceResponseWorkflow1.cs to open at the detectedNothing_TurnStarting method.

  3. Add the following text to the body of this method.

    this.detectedNothing.MainPrompt.SetText("Detected nothing");
    

For the final step, add a handler for the Closed event on detectAnsweringMachine. Because the designer does not support generating a handler for this event, you must add the entire method body in the code-beside. The following procedure provides the details.

To add a Closed handler for detectAnsweringMachine

  • In VoiceResponseWorkflow1.cs (the code-beside), add the following code. One place this handler can be inserted is right after the last TurnStarting event handler.

    private void detectAnsweringMachine_Closed(object sender, ActivityExecutionStatusChangedEventArgs e)
    {
      _recoResult = this.detectAnsweringMachine.ClassificationRecognitionResult;
      _detectionResult = this.detectAnsweringMachine.DetectionResult;
      _messageResult = this.detectAnsweringMachine.MessageResult;
    }
    

VoiceResponseWorkflow1.cs

When you have completed the preceding procedures, VoiceResponseWorkflow1.cs should appear as follows. Note that using statements for the Microsoft.SpeechServer.Recognition and Microsoft.SpeechServer namespaces have been added, but were not discussed in the preceding procedures.

using System;
using System.ComponentModel;
using System.ComponentModel.Design;
using System.Collections;
using System.Diagnostics;
using System.Drawing;
using System.Workflow.ComponentModel.Compiler;
using System.Workflow.ComponentModel.Serialization;
using System.Workflow.ComponentModel;
using System.Workflow.ComponentModel.Design;
using System.Workflow.Runtime;
using System.Workflow.Activities;
using System.Workflow.Activities.Rules;
using Microsoft.SpeechServer.Dialog;
using Microsoft.SpeechServer.Recognition;
using Microsoft.SpeechServer;

namespace DetectAnswerMachine
{
  
  public sealed partial class Workflow1: SpeechSequentialWorkflowActivity
  {
    private string _grxmlPath = @"D:\Program Files\Microsoft Office Communications Server 2007 Speech Server\SDK\Speech Projects\DetectAnsMach\DetectAnsMach\Grammars\Responses.grxml";
    private string _grammarRule = "Rule1";
    private string _outboundNumber = string.Empty;
    private RecognitionResult _recoResult = null;
    private DetectionResult _detectionResult = DetectionResult.None;
    private MessageResult _messageResult = MessageResult.None;

    public Workflow1()
    {
      InitializeComponent();
      this.detectAnsweringMachine.Grammar = new Grammar(new Uri(_grxmlPath), _grammarRule);
      this.detectAnsweringMachine.Closed += new System.EventHandler<System.Workflow.ComponentModel.ActivityExecutionStatusChangedEventArgs>(this.detectAnsweringMachine_Closed);
    }

    //This method is called when any exception other than a CallDisconnectedException occurs within the workflow.
    //It does some generic exception logging, and is provided for convenience during debugging;
    //you should replace or augment this with your own error handling code.
    //CallDisconnectedExceptions are handled separately; see HandleCallDisconnected, below.
    private void HandleGeneralFault(object sender, EventArgs e)
    {
      //The fault property is read only.  When an exception is thrown the actual exception is 
      //stored in this property.  Check this value for error information.
      string errorMessage = this.generalFaultHandler.Fault.Message;
      
      if(Debugger.IsAttached)
      {
        //If the debugger is attached, break here
        //so that you can see the error that occurred.
        //(Check the errorMessage variable above.)
        Debugger.Break();
      }

      //Write the error to both the NT Event Log and the .etl file,
      //so that some record is kept even if the debugger is not attached.
      TelephonySession.LoggingManager.LogApplicationError(
                       50000,  //the first parameter is an event id,
                               // chosen arbitrarily.
                               //Speech Server uses various IDs below 50000,
                               // so you might want to
                               //use IDs above this number to avoid overlap.
                       "An exception occurred in the Speech workflow with Id" +
                       this.WorkflowInstanceId +
                       ".  The exception was:\n" +
                       this.generalFaultHandler.Fault);

      //Dump a detailed version of the most recent Speech Server logs to the .etl file
      //(see the MMC for your current Speech Server logging settings, including the location of this file)
      this.TelephonySession.LoggingManager.DumpTargetedLogs();
    }


    //This method is called when a CallDisconnectedException occurs in the workflow.
    //This happens when a speech activity tried to run while the call is not connected,
    //and can happen for two reasons:
    //(1) The user hung up on the app in the middle of the flow.  This is expected and normal.
    //(2) You disconnected the call locally, and then attempted to run a speech activity.
    //    This is an application bug.
    private void HandleCallDisconnected(object sender, EventArgs e)
    {
      if (Debugger.IsAttached)
      {
        //If you just hung up on the app, ignore this breakpoint.
        Debugger.Break();
      }
    }

    private void OnEntry(object sender, EventArgs e)
    {
      if (QueryString == null || QueryString["OutboundPhoneNumber"] == null)
      {
        TelephonySession.LoggingManager.LogApplicationError(100, "Survey application - OutboundPhoneNumber parameter not specified.");
        throw new Exception("Survey application - OutboundPhoneNumber parameter not specified.");
      }
      this._outboundNumber = Uri.UnescapeDataString(QueryString["OutboundPhoneNumber"]);
      SipUri calledPartyUri = new SipUri(this._outboundNumber);
      SipUri callingPartyUri = new SipUri("sip:user@user1.contosa.com");
      this.makeCall.CalledParty = new SipUriTelephonyAddress(calledPartyUri);
      this.makeCall.CallingParty = new SipUriTelephonyAddress(callingPartyUri);
    }


    private void detectAnsweringMachine_TurnStarting(object sender, TurnStartingEventArgs e)
    {
      this.detectAnsweringMachine.Message.SetText("Sorry we missed you.");

    }

    private void detectedAnswerMachine_TurnStarting(object sender, TurnStartingEventArgs e)
    {
      this.detectedAnswerMachine.MainPrompt.SetText("Answering machine detected");
    }

    
    private void detectedHuman_TurnStarting(object sender, TurnStartingEventArgs e)
    {
      this.detectedHuman.MainPrompt.SetText("Human detected");
    }

    private void detectedNothing_TurnStarting(object sender, TurnStartingEventArgs e)
    {
      this.detectedNothing.MainPrompt.SetText("Detected nothing");
    }

    private void detectAnsweringMachine_Closed(object sender, ActivityExecutionStatusChangedEventArgs e)
    {
      _recoResult = this.detectAnsweringMachine.ClassificationRecognitionResult;
      _detectionResult = this.detectAnsweringMachine.DetectionResult;
      _messageResult = this.detectAnsweringMachine.MessageResult;
    }
    
  }
}

Next Step

Run the Detection Application