helpers package
Submodules
helpers.metrics module
- helpers.metrics.compute_classification_metrics(data_test)[source]
Compute metrics for classification only (no deferral).
- Parameters
data_test (dict) – dict data with fields ‘labels’, ‘preds’
- Returns
dict with metrics; ‘classifier_all_acc’ is the classifier accuracy on all data. AUC is also returned when predicted probabilities (preds_proba) are available.
- Return type
dict
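The documented behavior can be sketched as follows. This is a hypothetical re-implementation for illustration, not the library's code; the function name `compute_classification_metrics_sketch` and the restriction to accuracy (omitting AUC) are assumptions based only on the field names above.

```python
import numpy as np

def compute_classification_metrics_sketch(data_test):
    """Hypothetical sketch: accuracy of 'preds' against 'labels'."""
    labels = np.asarray(data_test["labels"])
    preds = np.asarray(data_test["preds"])
    # 'classifier_all_acc': fraction of predictions matching the labels
    return {"classifier_all_acc": float(np.mean(preds == labels))}

metrics = compute_classification_metrics_sketch(
    {"labels": [0, 1, 1, 0], "preds": [0, 1, 0, 0]}
)
# metrics["classifier_all_acc"] == 0.75
```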
- helpers.metrics.compute_coverage_v_acc_curve(data_test)[source]
- Parameters
data_test (dict) – dict data with field {‘defers’: defers_all, ‘labels’: truths_all, ‘hum_preds’: hum_preds_all, ‘preds’: predictions_all, ‘rej_score’: rej_score_all, ‘class_probs’: class_probs_all}
- Returns
a list of compute_deferral_metrics(data_test_modified) results at different coverage levels; the first element of the list is compute_deferral_metrics(data_test)
- Return type
list
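One plausible way to trace a coverage–accuracy curve from these fields is to defer the examples with the highest rejection scores at each target coverage. The sketch below is an assumption about how such a curve could be computed, not the library's implementation; the name `coverage_v_acc_sketch`, the `coverages` parameter, and the returned keys are all hypothetical.

```python
import numpy as np

def coverage_v_acc_sketch(data_test, coverages=(1.0, 0.8, 0.6)):
    """Hypothetical sketch: at each coverage level, keep the examples
    with the lowest 'rej_score' and measure classifier accuracy on them."""
    labels = np.asarray(data_test["labels"])
    preds = np.asarray(data_test["preds"])
    rej = np.asarray(data_test["rej_score"])
    order = np.argsort(rej)  # ascending: low-rejection examples first
    out = []
    for c in coverages:
        keep = order[: max(1, int(round(c * len(labels))))]
        out.append({
            "coverage": len(keep) / len(labels),
            "classifier_acc": float(np.mean(preds[keep] == labels[keep])),
        })
    return out

curve = coverage_v_acc_sketch({
    "labels": [0, 1, 1, 0, 1],
    "preds": [0, 1, 0, 0, 1],
    "rej_score": [0.1, 0.2, 0.9, 0.3, 0.05],
})
# Accuracy rises as the hardest (highest rej_score) examples are deferred.
```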
- helpers.metrics.compute_deferral_metrics(data_test)[source]
Compute metrics for a system that can defer predictions to a human.
- Parameters
data_test (dict) – dict data with fields ‘defers’, ‘labels’, ‘hum_preds’, ‘preds’
- Returns
dict with metrics: ‘classifier_all_acc’ (classifier accuracy on all data), ‘human_all_acc’ (human accuracy on all data), ‘coverage’ (how often the classifier predicts)
- Return type
dict
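The documented keys suggest the following behavior; the sketch below is a hypothetical re-implementation based only on the field names (‘defers’, ‘labels’, ‘hum_preds’, ‘preds’), and the name `compute_deferral_metrics_sketch` is an assumption.

```python
import numpy as np

def compute_deferral_metrics_sketch(data_test):
    """Hypothetical sketch: compute classifier accuracy, human accuracy,
    and coverage (fraction of examples not deferred to the human)."""
    defers = np.asarray(data_test["defers"]).astype(bool)
    labels = np.asarray(data_test["labels"])
    hum_preds = np.asarray(data_test["hum_preds"])
    preds = np.asarray(data_test["preds"])
    return {
        "classifier_all_acc": float(np.mean(preds == labels)),
        "human_all_acc": float(np.mean(hum_preds == labels)),
        "coverage": float(np.mean(~defers)),
    }

m = compute_deferral_metrics_sketch({
    "defers": [1, 0, 0, 1],
    "labels": [0, 1, 1, 0],
    "hum_preds": [0, 1, 0, 0],
    "preds": [1, 1, 1, 0],
})
# m == {"classifier_all_acc": 0.75, "human_all_acc": 0.75, "coverage": 0.5}
```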
helpers.training module
helpers.utils module
- class helpers.utils.AverageMeter[source]
Bases:
object
Computes and stores the average and current value
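A class with this one-line description commonly follows the familiar running-average pattern (track the latest value, a running sum, a count, and their ratio). The sketch below is an assumption about what AverageMeter likely does, written as a standalone class `AverageMeterSketch`; the attribute names `val`, `sum`, `count`, and `avg` are hypothetical.

```python
class AverageMeterSketch:
    """Hypothetical sketch: computes and stores the average and
    current value of a streaming quantity (e.g. a loss per batch)."""

    def __init__(self):
        self.reset()

    def reset(self):
        # Clear the running statistics.
        self.val = 0.0
        self.sum = 0.0
        self.count = 0
        self.avg = 0.0

    def update(self, val, n=1):
        # Record the latest value and fold n observations into the average.
        self.val = val
        self.sum += val * n
        self.count += n
        self.avg = self.sum / self.count

meter = AverageMeterSketch()
meter.update(2.0)
meter.update(4.0)
# meter.val == 4.0 and meter.avg == 3.0
```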