
Class TextClassificationModelMetrics

Model-level text classification metrics.

Inheritance
object
TextClassificationModelMetrics
Inherited Members
object.Equals(object)
object.Equals(object, object)
object.GetHashCode()
object.GetType()
object.MemberwiseClone()
object.ReferenceEquals(object, object)
object.ToString()
Namespace: Oci.AilanguageService.Models
Assembly: OCI.DotNetSDK.Ailanguage.dll
Syntax
public class TextClassificationModelMetrics
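
Example

A minimal sketch of consuming this class outside the service: because the properties carry Newtonsoft.Json JsonProperty attributes (shown under each declaration below), an instance can be round-tripped with JsonConvert. The JSON payload here is hypothetical.

using Newtonsoft.Json;
using Oci.AilanguageService.Models;

class MetricsExample
{
    static void Main()
    {
        // Hypothetical payload; the property names follow the JsonProperty
        // attributes declared on TextClassificationModelMetrics.
        const string json = @"{
            ""accuracy"": 0.91,
            ""macroF1"": 0.88,
            ""macroPrecision"": 0.90,
            ""macroRecall"": 0.86,
            ""microF1"": 0.91,
            ""microPrecision"": 0.91,
            ""microRecall"": 0.91
        }";

        var metrics = JsonConvert.DeserializeObject<TextClassificationModelMetrics>(json);
        System.Console.WriteLine($"Accuracy: {metrics?.Accuracy}, MacroF1: {metrics?.MacroF1}");
    }
}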

Properties

Accuracy

Declaration
[Required(ErrorMessage = "Accuracy is required.")]
[JsonProperty(PropertyName = "accuracy")]
public float? Accuracy { get; set; }
Property Value
Type Description
float?

The fraction of the labels that were correctly recognised.

Remarks

Required
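
Example

As a worked illustration of that definition (not SDK code), the sketch below computes the fraction of correctly recognised labels from two hypothetical label arrays.

// Hypothetical gold and predicted labels for five documents.
string[] actual    = { "sports", "finance", "sports", "tech", "tech" };
string[] predicted = { "sports", "finance", "tech",   "tech", "tech" };

int correct = 0;
for (int i = 0; i < actual.Length; i++)
{
    if (actual[i] == predicted[i]) correct++;
}

// 4 of the 5 labels match, so accuracy = 0.8.
float accuracy = (float)correct / actual.Length;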

MacroF1

Declaration
[Required(ErrorMessage = "MacroF1 is required.")]
[JsonProperty(PropertyName = "macroF1")]
public float? MacroF1 { get; set; }
Property Value
Type Description
float?

F1-score is a measure of a model's accuracy on a dataset.

Remarks

Required
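
Example

The reference text does not spell out the averaging scheme; under the conventional reading of "macro" (an unweighted mean of per-label scores), the value could be reproduced as in this sketch. The per-label precision/recall pairs are hypothetical.

// Hypothetical per-label (precision, recall) pairs.
var perLabel = new (float Precision, float Recall)[]
{
    (0.90f, 0.80f),  // label A
    (0.70f, 0.60f),  // label B
    (0.95f, 0.90f),  // label C
};

float sum = 0f;
foreach (var (p, r) in perLabel)
{
    // F1 is the harmonic mean of precision and recall.
    sum += 2 * p * r / (p + r);
}

// Macro F1: unweighted mean of the per-label F1 scores.
float macroF1 = sum / perLabel.Length;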

MacroPrecision

Declaration
[Required(ErrorMessage = "MacroPrecision is required.")]
[JsonProperty(PropertyName = "macroPrecision")]
public float? MacroPrecision { get; set; }
Property Value
Type Description
float?

Precision refers to the number of true positives divided by the total number of positive predictions (i.e., the number of true positives plus the number of false positives).

Remarks

Required
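
Example

A short illustration of the formula given above, with hypothetical confusion counts for a single label:

// Hypothetical counts for one label.
int truePositives = 45;
int falsePositives = 5;

// Precision = TP / (TP + FP) = 45 / 50 = 0.9.
float precision = (float)truePositives / (truePositives + falsePositives);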

MacroRecall

Declaration
[Required(ErrorMessage = "MacroRecall is required.")]
[JsonProperty(PropertyName = "macroRecall")]
public float? MacroRecall { get; set; }
Property Value
Type Description
float?

Measures the model's ability to identify the actual positive classes. It is the ratio between the true positives and everything that was actually tagged as positive (true positives plus false negatives). The recall metric reveals how many of the actual classes were correctly predicted.

Remarks

Required
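
Example

The same style of illustration for recall, again with hypothetical counts; note the denominator counts everything actually tagged as positive (true positives plus false negatives):

// Hypothetical counts for one label.
int truePositives = 45;
int falseNegatives = 15;

// Recall = TP / (TP + FN) = 45 / 60 = 0.75.
float recall = (float)truePositives / (truePositives + falseNegatives);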

MicroF1

Declaration
[Required(ErrorMessage = "MicroF1 is required.")]
[JsonProperty(PropertyName = "microF1")]
public float? MicroF1 { get; set; }
Property Value
Type Description
float?

F1-score is a measure of a model's accuracy on a dataset.

Remarks

Required
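
Example

Under the conventional definition of "micro" averaging, the true-positive, false-positive, and false-negative counts are pooled across all labels before the metric is computed. This sketch (hypothetical counts) derives micro precision and recall along the way, so it also illustrates the MicroPrecision and MicroRecall properties below.

// Hypothetical per-label (TP, FP, FN) counts.
var counts = new (int Tp, int Fp, int Fn)[]
{
    (40, 10, 5),   // label A
    (30,  5, 10),  // label B
    (25,  5, 5),   // label C
};

int tp = 0, fp = 0, fn = 0;
foreach (var (t, f, n) in counts) { tp += t; fp += f; fn += n; }

// Micro metrics are computed from the pooled counts.
float microPrecision = (float)tp / (tp + fp);
float microRecall    = (float)tp / (tp + fn);
float microF1 = 2 * microPrecision * microRecall / (microPrecision + microRecall);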

MicroPrecision

Declaration
[Required(ErrorMessage = "MicroPrecision is required.")]
[JsonProperty(PropertyName = "microPrecision")]
public float? MicroPrecision { get; set; }
Property Value
Type Description
float?

Precision refers to the number of true positives divided by the total number of positive predictions (i.e., the number of true positives plus the number of false positives).

Remarks

Required

MicroRecall

Declaration
[Required(ErrorMessage = "MicroRecall is required.")]
[JsonProperty(PropertyName = "microRecall")]
public float? MicroRecall { get; set; }
Property Value
Type Description
float?

Measures the model's ability to identify the actual positive classes. It is the ratio between the true positives and everything that was actually tagged as positive (true positives plus false negatives). The recall metric reveals how many of the actual classes were correctly predicted.

Remarks

Required

WeightedF1

Declaration
[JsonProperty(PropertyName = "weightedF1")]
public float? WeightedF1 { get; set; }
Property Value
Type Description
float?

F1-score is a measure of a model's accuracy on a dataset.

WeightedPrecision

Declaration
[JsonProperty(PropertyName = "weightedPrecision")]
public float? WeightedPrecision { get; set; }
Property Value
Type Description
float?

Precision refers to the number of true positives divided by the total number of positive predictions (i.e., the number of true positives plus the number of false positives).

WeightedRecall

Declaration
[JsonProperty(PropertyName = "weightedRecall")]
public float? WeightedRecall { get; set; }
Property Value
Type Description
float?

Measures the model's ability to identify the actual positive classes. It is the ratio between the true positives and everything that was actually tagged as positive (true positives plus false negatives). The recall metric reveals how many of the actual classes were correctly predicted.
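
Example

The reference text does not define the weighting; under the common convention, the three Weighted* properties are support-weighted averages of the per-label scores, with each label weighted by its number of true instances. A sketch under that assumption, with hypothetical per-label values:

// Hypothetical per-label scores with support (count of true instances).
var labels = new (float Precision, float Recall, int Support)[]
{
    (0.90f, 0.80f, 100),  // label A
    (0.70f, 0.60f,  20),  // label B
    (0.95f, 0.90f,  80),  // label C
};

float wp = 0f, wr = 0f, wf = 0f;
int total = 0;
foreach (var (p, r, s) in labels)
{
    float f1 = 2 * p * r / (p + r);      // per-label F1
    wp += p * s; wr += r * s; wf += f1 * s;
    total += s;
}

// Support-weighted averages.
float weightedPrecision = wp / total;
float weightedRecall    = wr / total;
float weightedF1        = wf / total;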
