Class InferenceModelConfiguration

    Definition

    Namespace:
    Tizen.Multimedia.Vision
    Assembly:
    Tizen.Multimedia.Vision.dll

    Represents a configuration of FaceDetector, FacialLandmarkDetector, ImageClassifier and ObjectDetector.

    public class InferenceModelConfiguration : EngineConfiguration
    Inheritance
    object
    EngineConfiguration
    InferenceModelConfiguration
    Remarks

    'Inference model' means pre-learned data, which is represented by ConfigurationFilePath, WeightFilePath, and CategoryFilePath.
    To use the default Tizen inference model and its related values, refer to the Tizen guide page (https://developer.tizen.org/development/guides/.net-application).
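
    Examples

    A minimal configuration sketch for object detection or image classification. The file paths and the InferenceBackendType.TFLite value are illustrative assumptions; use the files and backend that match your own model.

    var config = new InferenceModelConfiguration
    {
        // Pre-learned model data (hypothetical paths).
        WeightFilePath = "/opt/usr/apps/my-app/res/mobilenet_v1.tflite",
        CategoryFilePath = "/opt/usr/apps/my-app/res/labels.txt",

        Backend = InferenceBackendType.TFLite,   // assumed to be supported on the device
        Device = InferenceTargetDevice.CPU,
        DataType = InferenceDataType.Float32,
        ConfidenceThreshold = 0.6
    };

    // Depending on the model, additional properties (TensorSize, TensorChannels,
    // InputNodeName, OutputNodeName, and so on) may also be required.
    config.LoadInferenceModel();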

    Constructors


    InferenceModelConfiguration()

    Initializes a new instance of the InferenceModelConfiguration class.

    Declaration
    public InferenceModelConfiguration()
    Remarks

    'Inference model' means pre-learned data, which is represented by ConfigurationFilePath, WeightFilePath, and CategoryFilePath.
    To use the default Tizen inference model and its related values, refer to the Tizen guide page (https://developer.tizen.org/development/guides/.net-application).

    Properties


    Backend

    Gets or sets the inference model's backend engine.

    Declaration
    public InferenceBackendType Backend { get; set; }
    Property Value
    Type Description
    InferenceBackendType
    Remarks

    The default backend type is OpenCV.

    See Also
    SupportedBackend
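
    Examples

    A short sketch that prefers a hypothetical TFLite backend when the device reports it in SupportedBackend, and otherwise keeps the default (OpenCV). System.Linq is assumed for Contains.

    using System.Linq;

    var config = new InferenceModelConfiguration();

    // Switch to TFLite only if the current device supports it;
    // otherwise the default backend (OpenCV) remains in effect.
    if (config.SupportedBackend.Contains(InferenceBackendType.TFLite))
    {
        config.Backend = InferenceBackendType.TFLite;
    }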

    CategoryFilePath

    Gets or sets the path of inference model's category file.

    Declaration
    public string CategoryFilePath { get; set; }
    Property Value
    Type Description
    string
    Remarks

    This value should be set to use ImageClassifier or ObjectDetector.


    ConfidenceThreshold

    Gets or sets the threshold of confidence.

    Declaration
    public double ConfidenceThreshold { get; set; }
    Property Value
    Type Description
    double
    Remarks

    The valid range is from 0.0 to 1.0, inclusive.
    The value 1.0 means maximum accuracy.
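
    Examples

    A one-line sketch: results whose confidence falls below this value are presumably discarded by the detector, so a higher threshold trades recall for precision.

    // Keep only results reported with at least 70% confidence (value chosen for illustration).
    config.ConfidenceThreshold = 0.7;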


    ConfigurationFilePath

    Gets or sets the path of inference model's configuration data file.

    Declaration
    public string ConfigurationFilePath { get; set; }
    Property Value
    Type Description
    string
    Remarks

    'Inference model' means pre-learned data, which is represented by ConfigurationFilePath, WeightFilePath, and CategoryFilePath.
    To use the default Tizen inference model and its related values, refer to the Tizen guide page (https://developer.tizen.org/development/guides/.net-application).


    DataType

    Gets or sets the type of data used for inference model.

    Declaration
    public InferenceDataType DataType { get; set; }
    Property Value
    Type Description
    InferenceDataType
    Remarks

    For example, this value should be set to Float32 for model data that supports float32.
    Float32 is used internally if the user doesn't set this value.


    Device

    Gets or sets the processor type for inference models.

    Declaration
    public InferenceTargetDevice Device { get; set; }
    Property Value
    Type Description
    InferenceTargetDevice
    Remarks

    The default device is CPU.
    If the device doesn't support GPU or Custom, CPU will be used internally regardless of the user's choice.


    InputNodeName

    Gets or sets the name of an input node.

    Declaration
    public string InputNodeName { get; set; }
    Property Value
    Type Description
    string
    Remarks

    'Inference model' means pre-learned data, which is represented by ConfigurationFilePath, WeightFilePath, and CategoryFilePath.
    To use the default Tizen inference model and its related values, refer to the Tizen guide page (https://developer.tizen.org/development/guides/.net-application).


    MaxOutputNumber

    Gets or sets the maximum output number of detection or classification.

    Declaration
    public int MaxOutputNumber { get; set; }
    Property Value
    Type Description
    int
    Remarks

    An input value over 10 will be set to 10, and an input value under 1 will be set to 1.
    This value can be used to decide the size of Roi; its length should be the same.


    MeanValue

    Gets or sets the inference model's mean value.

    Declaration
    public double MeanValue { get; set; }
    Property Value
    Type Description
    double
    Remarks

    It should be greater than or equal to 0.


    MetadataFilePath

    Gets or sets the path of inference model's metadata file.

    Declaration
    public string MetadataFilePath { get; set; }
    Property Value
    Type Description
    string
    Remarks

    This value should be set to use ImageClassifier or ObjectDetector.


    OutputNodeName

    Gets or sets the name of an output node.

    Declaration
    public IList<string> OutputNodeName { get; set; }
    Property Value
    Type Description
    IList<string>
    Remarks

    'Inference model' means pre-learned data, which is represented by ConfigurationFilePath, WeightFilePath, and CategoryFilePath.
    To use the default Tizen inference model and its related values, refer to the Tizen guide page (https://developer.tizen.org/development/guides/.net-application).


    Roi

    Gets or sets the ROI (region of interest) of ImageClassifier and FacialLandmarkDetector.

    Declaration
    public Rectangle? Roi { get; set; }
    Property Value
    Type Description
    Rectangle?
    Remarks

    The default value is null. If Roi is null, the entire region of the MediaVisionSource will be analyzed.

    See Also
    MaxOutputNumber
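
    Examples

    A short sketch restricting analysis to a sub-region of the source image. The coordinates are hypothetical and assume the Tizen.Multimedia.Rectangle(x, y, width, height) constructor.

    // Analyze only a 200x200 region whose top-left corner is at (100, 50).
    config.Roi = new Rectangle(100, 50, 200, 200);

    // Setting Roi back to null analyzes the entire MediaVisionSource again.
    config.Roi = null;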

    StdValue

    Gets or sets the inference model's STD (standard deviation) value.

    Declaration
    public double StdValue { get; set; }
    Property Value
    Type Description
    double
    Remarks

    It should be greater than or equal to 0.


    SupportedBackend

    Gets the list of inference backend engines supported by the current device.

    Declaration
    public IEnumerable<InferenceBackendType> SupportedBackend { get; }
    Property Value
    Type Description
    IEnumerable<InferenceBackendType>

    If there is no supported backend, an empty collection will be returned.

    Remarks

    'Inference model' means pre-learned data, which is represented by ConfigurationFilePath, WeightFilePath, and CategoryFilePath.
    To use the default Tizen inference model and its related values, refer to the Tizen guide page (https://developer.tizen.org/development/guides/.net-application).


    TensorChannels

    Gets or sets the number of the inference model's tensor channels.

    Declaration
    public int TensorChannels { get; set; }
    Property Value
    Type Description
    int
    Remarks

    For example, this value should be set to 3 for the RGB color space.
    It should be greater than 0.


    TensorSize

    Gets or sets the size of inference model's tensor.

    Declaration
    public Size TensorSize { get; set; }
    Property Value
    Type Description
    Size
    Remarks

    Both the width and height of the tensor should be greater than 0.
    Size(-1, -1) is allowed when the intention is to use the original image source size as the TensorSize.
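
    Examples

    A sketch of the tensor-related properties for a hypothetical 224x224 RGB float32 model. The concrete numbers depend entirely on the model being loaded, and the Size(width, height) constructor is assumed.

    // Hypothetical values for a 224x224 RGB float32 model.
    config.TensorSize = new Size(224, 224);      // input tensor width and height
    config.TensorChannels = 3;                   // RGB => 3 channels
    config.DataType = InferenceDataType.Float32;
    config.MeanValue = 127.5;                    // model-specific normalization values (assumed)
    config.StdValue = 127.5;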


    WeightFilePath

    Gets or sets the path of inference model's weight file.

    Declaration
    public string WeightFilePath { get; set; }
    Property Value
    Type Description
    string
    Remarks

    'Inference model' means pre-learned data, which is represented by ConfigurationFilePath, WeightFilePath, and CategoryFilePath.
    To use the default Tizen inference model and its related values, refer to the Tizen guide page (https://developer.tizen.org/development/guides/.net-application).

    Methods


    Dispose(bool)

    Releases the resources used by the InferenceModelConfiguration object.

    Declaration
    protected override void Dispose(bool disposing)
    Parameters
    Type Name Description
    bool disposing

    true to release both managed and unmanaged resources; false to release only unmanaged resources.

    Overrides
    EngineConfiguration.Dispose(bool)
    Remarks

    'Inference model' means pre-learned data, which is represented by ConfigurationFilePath, WeightFilePath, and CategoryFilePath.
    To use the default Tizen inference model and its related values, refer to the Tizen guide page (https://developer.tizen.org/development/guides/.net-application).


    LoadInferenceModel()

    Loads inference model data and its related attributes.

    Declaration
    public void LoadInferenceModel()
    Remarks

    Before calling this method, the user should set all properties required by the inference model.
    Properties set after calling this method will not affect the result.

    Exceptions
    Type Condition
    FileFormatException

    Invalid data type is used in inference model data.
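
    Examples

    The following sketch loads the model after the required properties have been set and handles the documented FileFormatException; the property values set earlier are hypothetical.

    try
    {
        // All properties required by the model (file paths, tensor information,
        // node names, and so on) must already be set at this point.
        config.LoadInferenceModel();
    }
    catch (FileFormatException)
    {
        // Thrown when the inference model data uses an invalid data type.
        // Handle the error here, for example by falling back to another model.
    }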

    Extension Methods

    EXamlExtensions.LoadFromEXamlByRelativePath<T>(T, string)
    Extensions.LoadFromXaml<TXaml>(TXaml, string)
    Extensions.LoadFromXaml<TXaml>(TXaml, Type)
    Extensions.LoadFromXamlFile<TXaml>(TXaml, string)