ImageDev

TextureClassificationApply

Classifies all pixels of an image using a trained texture model.


For each pixel of the input image, the algorithm extracts the texture features selected in the classification model at the training step and computes the distance to each class of the model.

Each pixel is assigned to the closest class, and the corresponding label is written to the output label image.
The distances are stored in an output uncertainty image, according to the metric selected by the outputMapType parameter.
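As an illustration of this per-pixel decision rule, here is a minimal NumPy sketch (not the ImageDev implementation): feature extraction is replaced by a precomputed feature array, and the trained model is represented by per-class mean vectors and inverse covariance matrices.

```python
import numpy as np

def classify_pixels(features, class_means, class_inv_covs):
    """Assign each pixel to the class with the smallest Mahalanobis distance.

    features: (H, W, F) array of per-pixel texture features.
    class_means: list of (F,) mean vectors, one per class.
    class_inv_covs: list of (F, F) inverse covariance matrices, one per class.
    Returns (labels, distances): labels is (H, W) with labels starting at 1,
    distances is (H, W, n_classes), as in CLASS_DISTANCE mode.
    """
    h, w, f = features.shape
    flat = features.reshape(-1, f)
    dists = []
    for mu, inv_cov in zip(class_means, class_inv_covs):
        diff = flat - mu
        # Mahalanobis distance of every pixel to this class
        d = np.sqrt(np.einsum("ij,jk,ik->i", diff, inv_cov, diff))
        dists.append(d)
    dists = np.stack(dists, axis=-1)        # (H*W, n_classes)
    labels = dists.argmin(axis=-1) + 1      # closest class wins
    return labels.reshape(h, w), dists.reshape(h, w, -1)
```

The names `classify_pixels`, `class_means`, and `class_inv_covs` are assumptions for the sketch; the real algorithm additionally extracts the model's texture features at each pixel before measuring distances.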


Function Syntax

This function returns a TextureClassificationApplyOutput structure containing outputLabelImage and outputMapImage.
// Output structure of the textureClassificationApply function.
struct TextureClassificationApplyOutput
{
    /// The output label image representing the texture classification result. Its dimensions and type are forced to the same values as the training input.
    std::shared_ptr< iolink::ImageView > outputLabelImage;
    /// The output map image. Its dimensions are forced to the same values as the training image. In CLASS_DISTANCE mode, its number of channels is equal to the number of classes defined in the training image. Its data type is forced to floating point.
    std::shared_ptr< iolink::ImageView > outputMapImage;
};

// Function prototype
TextureClassificationApplyOutput
textureClassificationApply( std::shared_ptr< iolink::ImageView > inputImage,
                            TextureClassificationModel::Ptr inputModel,
                            TextureClassificationApply::OutputMapType outputMapType,
                            std::shared_ptr< iolink::ImageView > outputLabelImage = nullptr,
                            std::shared_ptr< iolink::ImageView > outputMapImage = nullptr );
This function returns a tuple containing output_label_image and output_map_image.
// Function prototype.
texture_classification_apply(input_image: idt.ImageType,
                             input_model: Union[Any, None] = None,
                             output_map_type: TextureClassificationApply.OutputMapType = TextureClassificationApply.OutputMapType.CLOSEST_DISTANCE,
                             output_label_image: idt.ImageType = None,
                             output_map_image: idt.ImageType = None) -> Tuple[idt.ImageType, idt.ImageType]
This function returns a TextureClassificationApplyOutput structure containing outputLabelImage and outputMapImage.
/// Output structure of the TextureClassificationApply function.
public struct TextureClassificationApplyOutput
{
    /// The output label image representing the texture classification result. Its dimensions and type are forced to the same values as the training input.
    public IOLink.ImageView outputLabelImage;
    /// The output map image. Its dimensions are forced to the same values as the training image. In CLASS_DISTANCE mode, its number of channels is equal to the number of classes defined in the training image. Its data type is forced to floating point.
    public IOLink.ImageView outputMapImage;
};

// Function prototype.
public static TextureClassificationApplyOutput
TextureClassificationApply( IOLink.ImageView inputImage,
                            TextureClassificationModel inputModel = null,
                            TextureClassificationApply.OutputMapType outputMapType = ImageDev.TextureClassificationApply.OutputMapType.CLOSEST_DISTANCE,
                            IOLink.ImageView outputLabelImage = null,
                            IOLink.ImageView outputMapImage = null );

Class Syntax

Parameters

inputImage (input)
    The input grayscale image.
    Type: Image. Supported values: Grayscale. Default value: nullptr.

inputModel (input)
    The input texture classification model, previously trained.
    Type: TextureClassificationModel. Default value: nullptr.

outputMapType (input)
    The type of uncertainty map image to compute.
    - CLOSEST_DISTANCE: The uncertainty map represents the Mahalanobis distance to the class selected by the classification. The closer this metric is to 0, the more confident the classification.
    - RELATIVE_DISTANCE: The uncertainty map represents the Mahalanobis distance to the class selected by the classification (d1), weighted by the gap with the second closest distance (d2). The smaller this metric is, the more confident and less ambiguous the classification. $$ \mathrm{MapValue} = \log\left(\frac{d_1}{d_2 - d_1}\right) $$
    - CLASS_DISTANCE: The uncertainty map is a multichannel image where each channel represents the distance to the corresponding class.
    - NONE: No uncertainty map is computed.
    Type: Enumeration. Default value: CLOSEST_DISTANCE.

outputLabelImage (output)
    The output label image representing the texture classification result. Its dimensions and type are forced to the same values as the training input.
    Type: Image. Default value: nullptr.

outputMapImage (output)
    The output map image. Its dimensions are forced to the same values as the training image. In CLASS_DISTANCE mode, its number of channels is equal to the number of classes defined in the training image. Its data type is forced to floating point.
    Type: Image. Default value: nullptr.
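The RELATIVE_DISTANCE metric can be sketched in plain NumPy, starting from a per-class distance array such as CLASS_DISTANCE mode produces. The function name and the epsilon guard below are illustrative, not part of the ImageDev API.

```python
import numpy as np

def relative_distance_map(class_dists, eps=1e-12):
    """class_dists: (H, W, n_classes) array of Mahalanobis distances.
    Returns an (H, W) map of log(d1 / (d2 - d1)), where d1 is the distance
    to the closest class and d2 the distance to the second closest.
    Small values mean a confident, unambiguous classification."""
    sorted_d = np.sort(class_dists, axis=-1)
    d1, d2 = sorted_d[..., 0], sorted_d[..., 1]
    # eps guards against division by zero (d1 == d2) and log(0) (d1 == 0)
    return np.log(d1 / np.maximum(d2 - d1, eps) + eps)
```

A pixel whose runner-up class is far away (large d2 - d1) gets a strongly negative map value, while a pixel sitting between two classes gets a large one.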
input_image (input)
    The input grayscale image.
    Type: image. Supported values: Grayscale. Default value: None.

input_model (input)
    The input texture classification model, previously trained.
    Type: TextureClassificationModel. Default value: None.

output_map_type (input)
    The type of uncertainty map image to compute.
    - CLOSEST_DISTANCE: The uncertainty map represents the Mahalanobis distance to the class selected by the classification. The closer this metric is to 0, the more confident the classification.
    - RELATIVE_DISTANCE: The uncertainty map represents the Mahalanobis distance to the class selected by the classification (d1), weighted by the gap with the second closest distance (d2). The smaller this metric is, the more confident and less ambiguous the classification. $$ \mathrm{MapValue} = \log\left(\frac{d_1}{d_2 - d_1}\right) $$
    - CLASS_DISTANCE: The uncertainty map is a multichannel image where each channel represents the distance to the corresponding class.
    - NONE: No uncertainty map is computed.
    Type: enumeration. Default value: CLOSEST_DISTANCE.

output_label_image (output)
    The output label image representing the texture classification result. Its dimensions and type are forced to the same values as the training input.
    Type: image. Default value: None.

output_map_image (output)
    The output map image. Its dimensions are forced to the same values as the training image. In CLASS_DISTANCE mode, its number of channels is equal to the number of classes defined in the training image. Its data type is forced to floating point.
    Type: image. Default value: None.
inputImage (input)
    The input grayscale image.
    Type: Image. Supported values: Grayscale. Default value: null.

inputModel (input)
    The input texture classification model, previously trained.
    Type: TextureClassificationModel. Default value: null.

outputMapType (input)
    The type of uncertainty map image to compute.
    - CLOSEST_DISTANCE: The uncertainty map represents the Mahalanobis distance to the class selected by the classification. The closer this metric is to 0, the more confident the classification.
    - RELATIVE_DISTANCE: The uncertainty map represents the Mahalanobis distance to the class selected by the classification (d1), weighted by the gap with the second closest distance (d2). The smaller this metric is, the more confident and less ambiguous the classification. $$ \mathrm{MapValue} = \log\left(\frac{d_1}{d_2 - d_1}\right) $$
    - CLASS_DISTANCE: The uncertainty map is a multichannel image where each channel represents the distance to the corresponding class.
    - NONE: No uncertainty map is computed.
    Type: Enumeration. Default value: CLOSEST_DISTANCE.

outputLabelImage (output)
    The output label image representing the texture classification result. Its dimensions and type are forced to the same values as the training input.
    Type: Image. Default value: null.

outputMapImage (output)
    The output map image. Its dimensions are forced to the same values as the training image. In CLASS_DISTANCE mode, its number of channels is equal to the number of classes defined in the training image. Its data type is forced to floating point.
    Type: Image. Default value: null.

Object Examples

auto classification_input = readVipImage( std::string( IMAGEDEVDATA_IMAGES_FOLDER ) + "classification_input.vip" );
TextureClassificationModel::Ptr modelToApply = TextureClassificationModel::read( std::string( IMAGEDEVDATA_OBJECTS_FOLDER ) + "modelToApply.vip" );

TextureClassificationApply textureClassificationApplyAlgo;
textureClassificationApplyAlgo.setInputImage( classification_input );
textureClassificationApplyAlgo.setInputModel( modelToApply );
textureClassificationApplyAlgo.setOutputMapType( TextureClassificationApply::OutputMapType::CLOSEST_DISTANCE );
textureClassificationApplyAlgo.execute();

std::cout << "outputLabelImage:" << textureClassificationApplyAlgo.outputLabelImage()->toString();
std::cout << "outputMapImage:" << textureClassificationApplyAlgo.outputMapImage()->toString();
classification_input = imagedev.read_vip_image(imagedev_data.get_image_path("classification_input.vip"))
model_to_apply = imagedev.TextureClassificationModel.read(imagedev_data.get_object_path("modelToApply.vip"))

texture_classification_apply_algo = imagedev.TextureClassificationApply()
texture_classification_apply_algo.input_image = classification_input
texture_classification_apply_algo.input_model = model_to_apply
texture_classification_apply_algo.output_map_type = imagedev.TextureClassificationApply.CLOSEST_DISTANCE
texture_classification_apply_algo.execute()

print("output_label_image:", str(texture_classification_apply_algo.output_label_image))
print("output_map_image:", str(texture_classification_apply_algo.output_map_image))
ImageView classification_input = Data.ReadVipImage( @"Data/images/classification_input.vip" );
TextureClassificationModel modelToApply = TextureClassificationModel.Read( @"Data/objects/modelToApply.vip" );

TextureClassificationApply textureClassificationApplyAlgo = new TextureClassificationApply
{
    inputImage = classification_input,
    inputModel = modelToApply,
    outputMapType = TextureClassificationApply.OutputMapType.CLOSEST_DISTANCE
};
textureClassificationApplyAlgo.Execute();

Console.WriteLine( "outputLabelImage:" + textureClassificationApplyAlgo.outputLabelImage.ToString() );
Console.WriteLine( "outputMapImage:" + textureClassificationApplyAlgo.outputMapImage.ToString() );
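A common follow-up to the examples above is to reject low-confidence pixels using the uncertainty map. The sketch below is hypothetical post-processing, not part of the shipped examples: `labels` and `uncertainty` stand in for NumPy arrays obtained from outputLabelImage and outputMapImage in CLOSEST_DISTANCE mode.

```python
import numpy as np

def mask_uncertain(labels, uncertainty, threshold):
    """Set the label to 0 wherever the uncertainty exceeds the threshold.

    labels: (H, W) integer label image (classes start at 1).
    uncertainty: (H, W) CLOSEST_DISTANCE map; values near 0 are confident.
    Returns a label image where rejected pixels are marked 0.
    """
    return np.where(uncertainty <= threshold, labels, 0)
```

The threshold value is application-dependent; inspecting the histogram of the uncertainty map is one way to choose it.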

Function Examples

auto classification_input = readVipImage( std::string( IMAGEDEVDATA_IMAGES_FOLDER ) + "classification_input.vip" );
TextureClassificationModel::Ptr modelToApply = TextureClassificationModel::read( std::string( IMAGEDEVDATA_OBJECTS_FOLDER ) + "modelToApply.vip" );

auto result = textureClassificationApply( classification_input, modelToApply, TextureClassificationApply::OutputMapType::CLOSEST_DISTANCE );

std::cout << "outputLabelImage:" << result.outputLabelImage->toString();
std::cout << "outputMapImage:" << result.outputMapImage->toString();
classification_input = imagedev.read_vip_image(imagedev_data.get_image_path("classification_input.vip"))
model_to_apply = imagedev.TextureClassificationModel.read(imagedev_data.get_object_path("modelToApply.vip"))

result_output_label_image, result_output_map_image = imagedev.texture_classification_apply(classification_input, model_to_apply, imagedev.TextureClassificationApply.CLOSEST_DISTANCE)

print("output_label_image:", str(result_output_label_image))
print("output_map_image:", str(result_output_map_image))
ImageView classification_input = Data.ReadVipImage( @"Data/images/classification_input.vip" );
TextureClassificationModel modelToApply = TextureClassificationModel.Read( @"Data/objects/modelToApply.vip" );

Processing.TextureClassificationApplyOutput result = Processing.TextureClassificationApply( classification_input, modelToApply, TextureClassificationApply.OutputMapType.CLOSEST_DISTANCE );

Console.WriteLine( "outputLabelImage:" + result.outputLabelImage.ToString() );
Console.WriteLine( "outputMapImage:" + result.outputMapImage.ToString() );