One-vs-All Multiclass


Updated: October 5, 2017

Creates a multiclass classification model from an ensemble of binary classification models

Category: Machine Learning / Initialize Model / Classification

This article describes how to use the One-Vs-All Multiclass module in Azure Machine Learning Studio to create a classification model that can predict multiple classes.

This classifier must be connected to an existing binary classification algorithm. You configure the two-class model, and then train the combination of models by using Train Model with a labeled training dataset.

Even though the training dataset might have multiple class values, the One-Vs-All Multiclass module internally creates one binary classification model per class, optimizes the algorithm for each class, and then merges the models into a single multiclass model.

This module is useful for creating models that predict three or more possible outcomes, when the outcome depends on continuous or categorical predictor variables. This method also lets you apply binary classification methods to problems that require multiple output classes.

While some classification algorithms permit the use of more than two classes by design, others restrict the possible outcomes to one of two values (a binary, or two-class, model). However, even binary classification algorithms can be adapted for multiclass classification tasks by using a variety of strategies.

This module implements the one-vs-all method, in which a binary model is created for each of the multiple output classes. Each binary model for an individual class is assessed against its complement (all other classes in the model) as though it were a binary classification problem. Prediction is then performed by running all of these binary classifiers and choosing the prediction with the highest confidence score.

In essence, an ensemble of individual models is created and the results are then merged to create a single model that predicts all classes. Thus, any binary classifier can be used as the basis for a one-vs-all model.
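Azure Machine Learning Studio performs this wiring for you on the canvas, but the mechanism is easy to sketch in code. The following Python example is an illustrative sketch only (it uses scikit-learn rather than the Studio API, and LogisticRegression stands in for whichever two-class learner you choose): it trains one binary model per class and predicts the class whose binary model returns the highest score.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)            # three classes: 0, 1, 2
classes = np.unique(y)

# Train one binary model per class: "this class" vs. "all other classes".
binary_models = []
for c in classes:
    model = LogisticRegression(max_iter=1000)
    model.fit(X, (y == c).astype(int))        # relabel: 1 for class c, 0 otherwise
    binary_models.append(model)

# Predict by scoring every binary model and keeping the most confident one.
scores = np.column_stack([m.decision_function(X) for m in binary_models])
predictions = classes[np.argmax(scores, axis=1)]
print("training accuracy:", (predictions == y).mean())
```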

For example, let’s say you configure a Two-Class Support Vector Machine model and provide that as input to the One-Vs-All Multiclass module. The module would create a two-class support vector machine model for each of the output classes and then apply the one-vs-all method to combine the results across all classes.
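In Studio you do this by connecting modules rather than writing code, but a hedged scikit-learn analog may make the workflow concrete: here LinearSVC plays the role of the two-class support vector machine, and OneVsRestClassifier plays the role of the One-Vs-All Multiclass module.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import LinearSVC

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The binary SVM is configured first; the one-vs-rest wrapper then builds
# one copy of it per class, mirroring the Studio workflow.
ova_svm = OneVsRestClassifier(LinearSVC())
ova_svm.fit(X_train, y_train)

print("binary models trained:", len(ova_svm.estimators_))   # one per class
print("test accuracy:", ova_svm.score(X_test, y_test))
```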

  1. Add the One-Vs-All Multiclass module to your experiment.

  2. Add one of the two-class classification models to the experiment, and configure that model.

    For example, you might use a Two-Class Support Vector Machine or a Two-Class Boosted Decision Tree model.

    Warning

    Need help choosing the right algorithm? See these resources:

    Note that the One-Vs-All Multiclass classifier has no configurable parameters of its own. Any customizations must be done in the model that is provided as input.

  3. Connect the untrained classifier that is the output of One-Vs-All Multiclass to Train Model.

    On the other input of Train Model, connect a labeled training dataset that has multiple class values.

  4. To train the classification model, run the experiment, or select Train Model and click Run Selected.

    Alternatively, you can pass the untrained classifier to Cross-Validate Model for cross-validation against a labeled validation dataset (a code sketch of this appears after these steps).

  5. After the classifier has been trained, you can use the model to make multiclass predictions.
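The canvas steps above have rough equivalents in code. As an illustrative sketch only (again using scikit-learn as a stand-in rather than the Studio API), cross-validating the untrained one-vs-all classifier and then using the trained model for multiclass predictions might look like this:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import LinearSVC

X, y = load_iris(return_X_y=True)

# Step 4 analog: cross-validate the untrained classifier on labeled data.
untrained = OneVsRestClassifier(LinearSVC())
fold_scores = cross_val_score(untrained, X, y, cv=5)
print("per-fold accuracy:", fold_scores)

# Step 5 analog: train on the full dataset, then score new rows; the final
# class is the one whose binary model returns the highest score.
trained = untrained.fit(X, y)
new_rows = X[:3]                                 # stand-in for unseen data
print("per-class scores:\n", trained.decision_function(new_rows))
print("predicted classes:", trained.predict(new_rows))
print("argmax of scores: ", np.argmax(trained.decision_function(new_rows), axis=1))
```

The per-class scores make the one-vs-all decision visible: the predicted class is simply the one whose binary model is most confident.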

For examples of how this learning algorithm is used, see these sample experiments in the Model Gallery:

Inputs

Name                                    Type                Description
Untrained binary classification model   ILearner interface  An untrained binary classification model

Outputs

Name             Type                Description
Untrained model  ILearner interface  An untrained multiclass classification model

For a list of all error messages, see Module Error Codes.

Exception    Description
Error 0013   An exception occurs if the learner that was passed to the module is the wrong type.

See also

Classification
A-Z Module List
