precision

Measures the performance of a classification model in terms of the classifier's ability not to label a negative example as positive. The precision score can be interpreted as the probability that an example the classifier predicts as positive is actually positive, where the best value is 1 and the worst is 0.
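For a given class, precision is the ratio of true positives to all examples predicted positive:

Precision = TP / (TP + FP)

where TP is the number of true positives and FP the number of false positives (negative examples incorrectly labeled positive).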

Attention: Available only with Twin Activate commercial edition.

Syntax

Score = precision(targets, predictions, average)

Inputs

targets
Actual label for each observation.
Type: double
Dimension: vector
predictions
Predicted label for each observation.
Type: double
Dimension: vector
average
Averaging strategy for multiclass classification. Possible values are 'micro' (default), 'macro', and 'none'. If 'none' is chosen, the per-class metric is returned as output (illustrated in the sketch below).
Type: char
Dimension: string
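The averaging strategies combine the per-class counts in different ways: 'micro' pools true positives and positive predictions across all classes before dividing, while 'macro' computes precision per class and takes the unweighted mean. The following Octave-style sketch is illustrative only, not the library implementation (the variable names microScore and macroScore are ours, and it assumes element-wise comparison, sum, and mean behave as in Octave); it reproduces the scores for the data used in the example below:

% Per-class true-positive and positive-prediction counts.
targets     = [0, 1, 2, 3, 0, 1, 2, 3];
predictions = [1, 0, 2, 1, 3, 1, 0, 1];
classes = unique(targets);
tp = zeros(1, length(classes));   % true positives per class
pp = zeros(1, length(classes));   % positive predictions per class
for k = 1:length(classes)
    c = classes(k);
    tp(k) = sum(predictions == c & targets == c);
    pp(k) = sum(predictions == c);
end
% 'micro': pool counts over all classes, then divide.
microScore = sum(tp) / sum(pp);      % 2/8 = 0.25
% 'macro': per-class precision, then unweighted mean.
% max(pp, 1) guards against division by zero for classes never predicted.
macroScore = mean(tp ./ max(pp, 1)); % (0 + 0.25 + 1 + 0)/4 = 0.3125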

Outputs

Score
Precision score of the classifier.
Type: double
Dimension: scalar | struct (if 'none' is chosen)

Example

Usage of precision

targets = [0, 1, 2, 3, 0, 1, 2, 3];
predictions = [1, 0, 2, 1, 3, 1, 0, 1];
score1 = precision(targets, predictions);
score2 = precision(targets, predictions, 'micro');
score3 = precision(targets, predictions, 'macro');
score4 = precision(targets, predictions, 'none');
printf('Default (micro): %f \n', score1);
printf('Micro: %f \n', score2);
printf('Macro: %f \n', score3);
printf('None : ');
disp(score4);
Default (micro): 0.250000 
Micro: 0.250000 
Macro: 0.312500 
None : 
struct [
  0: 0
  1: 0.25
  2: 1
  3: 0
]
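Here only 2 of the 8 predictions match their targets, so the micro-averaged score is 2/8 = 0.25, while the macro score is the unweighted mean of the per-class values shown in the struct: (0 + 0.25 + 1 + 0)/4 = 0.3125.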