f1score
It measures the performance of a classification model in terms of both precision and recall. The F1 score is the harmonic mean of precision and recall, where the best value is 1 and the worst is 0.
Attention: Available only with Twin Activate commercial edition.
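As a minimal hand computation, assuming the standard definition F1 = 2 * precision * recall / (precision + recall), the score for a single class can be reproduced from its true-positive (tp), false-positive (fp), and false-negative (fn) counts. The counts below are those of class 1 in the Example further down; this is an illustration only, not a description of the function's internal logic.
% Standard F1 from per-class counts (illustrative; counts taken from class 1 of the Example).
tp = 2; fp = 2; fn = 0;
precision = tp / (tp + fp);                           % 0.5
recall = tp / (tp + fn);                              % 1.0
f1 = 2 * precision * recall / (precision + recall);
printf('F1: %f \n', f1);                              % prints 0.666667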
Syntax
Score = f1score(targets,predictions,average)
Inputs
- targets
- Actual label for each observation.
- predictions
- Predicted label for each observation.
- average
- Averaging strategy for multiclass classification. Possible values are 'micro' (default), 'macro', and 'none'. If 'none' is chosen, the per-class metric is returned (see the sketch after this list).
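As a rough sketch of what these averaging strategies conventionally mean (the three-class counts and variable names below are assumed for illustration and are not the function's internals): macro averaging computes one F1 per class and takes their unweighted mean, whereas micro averaging pools the true-positive, false-positive, and false-negative counts over all classes before computing a single F1.
% Illustrative per-class counts for a three-class problem (assumed data).
tp = [5 3 2];  fp = [1 2 4];  fn = [2 1 3];
% Per-class F1 (conceptually what 'none' returns), then the macro mean.
p_c = tp ./ (tp + fp);
r_c = tp ./ (tp + fn);
f1_c = 2 * p_c .* r_c ./ (p_c + r_c);
macro_f1 = mean(f1_c);
% Micro: pool the counts first, then compute a single precision, recall, and F1.
P = sum(tp) / (sum(tp) + sum(fp));
R = sum(tp) / (sum(tp) + sum(fn));
micro_f1 = 2 * P * R / (P + R);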
Outputs
- Score
- F1 score of the classifier.
Example
Usage of f1score
targets = [0 1 0 1];
predictions = [1 1 1 1];
score1 = f1score(targets, predictions);
score2 = f1score(targets, predictions, 'micro');
score3 = f1score(targets, predictions, 'macro');
score4 = f1score(targets, predictions, 'none');
printf('Micro: %f \n', score1);
printf('Micro: %f \n', score2);
printf('Macro: %f \n', score3);
printf('None : ');
disp(score4);
Micro: 0.666667
Micro: 0.666667
Macro: 0.666667
None :
0.666666667