Model performance
Once ground truth is ingested, Verta's monitoring system automatically computes the standard model performance metrics. Each metric can be plotted both as a single-value summary chart and as a time-series chart showing its value over time.

Support for different model types:

The performance metrics computed depend on the type of model being monitored. Currently, monitoring is supported for regression and classification models.
You can provide the model type when registering a model in Verta. That information is used to automatically compute the relevant performance metrics and populate the default monitoring dashboard.
Given below is a code example that assigns a model type to a Registered Model Version using the model attributes field:
model_version.add_attributes({
    "model_type": "regression",
    # "model_type": "classification",
})
If you don’t provide a model type, the system will default to classification.

Classification Models:

Given below are some of the metrics the system computes out of the box for a classification model:
  • Accuracy
  • Precision
  • Recall
  • F1
  • True positives
  • False positives
  • True negatives
  • False negatives
  • True positive rate
  • False positive rate
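For reference, these metrics all derive from the four confusion-matrix counts. The sketch below shows the standard definitions in plain Python; it is illustrative only, not Verta's internal implementation, and the example counts are made up:

```python
# Standard classification metrics from confusion-matrix counts.
# Illustrative sketch only -- not Verta's internal implementation.

def classification_metrics(tp, fp, tn, fn):
    total = tp + fp + tn + fn
    accuracy = (tp + tn) / total
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0   # = true positive rate
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    fpr = fp / (fp + tn) if (fp + tn) else 0.0      # false positive rate
    return {
        "accuracy": accuracy,
        "precision": precision,
        "recall": recall,
        "f1": f1,
        "true_positive_rate": recall,
        "false_positive_rate": fpr,
    }

# Hypothetical counts for illustration:
print(classification_metrics(tp=40, fp=10, tn=45, fn=5))
```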
For classification models, the Verta monitoring system computes the following decision charts to help debug production issues quickly:
  • ROC curve (receiver operating characteristic curve)
  • PR curve (Precision-Recall)
  • Confusion matrix
Note: only binary classification models are currently supported.
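These charts are built from model scores and ground-truth labels. The sketch below shows the underlying idea for a binary classifier: a confusion matrix at a fixed decision threshold, and ROC points (false positive rate, true positive rate) obtained by sweeping that threshold. The data is made up for illustration; this is not Verta's implementation:

```python
# Sketch: confusion matrix at one threshold, and ROC points from a
# threshold sweep, for a binary classifier. Illustrative only.

def confusion_matrix(y_true, scores, threshold=0.5):
    tp = fp = tn = fn = 0
    for y, s in zip(y_true, scores):
        pred = 1 if s >= threshold else 0
        if pred == 1 and y == 1:
            tp += 1
        elif pred == 1 and y == 0:
            fp += 1
        elif pred == 0 and y == 0:
            tn += 1
        else:
            fn += 1
    return tp, fp, tn, fn

def roc_points(y_true, scores):
    # One (FPR, TPR) point per distinct score, highest threshold first.
    points = []
    for t in sorted(set(scores), reverse=True):
        tp, fp, tn, fn = confusion_matrix(y_true, scores, threshold=t)
        tpr = tp / (tp + fn) if (tp + fn) else 0.0
        fpr = fp / (fp + tn) if (fp + tn) else 0.0
        points.append((fpr, tpr))
    return points

# Hypothetical labels and scores:
y_true = [1, 0, 1, 1, 0, 0]
scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.1]
print(confusion_matrix(y_true, scores))  # at the default 0.5 threshold
print(roc_points(y_true, scores))
```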

Regression Models:

For regression models, Verta computes several performance metrics out of the box and populates the default dashboards.
Given below are the metrics tracked for regression models:
  • Mean Absolute Error (MAE)
  • Mean Squared Error (MSE)
  • Root Mean Squared Error (RMSE)
  • R-Squared
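For reference, the sketch below computes these four metrics from predictions and ground truth using their standard definitions. The data is made up for illustration; this is not Verta's internal implementation:

```python
# Standard regression metrics (MAE, MSE, RMSE, R-squared).
# Illustrative sketch only -- not Verta's internal implementation.
import math

def regression_metrics(y_true, y_pred):
    n = len(y_true)
    errors = [yt - yp for yt, yp in zip(y_true, y_pred)]
    mae = sum(abs(e) for e in errors) / n            # Mean Absolute Error
    mse = sum(e * e for e in errors) / n             # Mean Squared Error
    rmse = math.sqrt(mse)                            # Root Mean Squared Error
    mean_y = sum(y_true) / n
    ss_tot = sum((y - mean_y) ** 2 for y in y_true)  # total sum of squares
    r2 = 1 - (mse * n) / ss_tot if ss_tot else 0.0   # R-squared
    return {"mae": mae, "mse": mse, "rmse": rmse, "r2": r2}

# Hypothetical ground truth and predictions:
print(regression_metrics([3.0, 5.0, 2.0, 7.0], [2.5, 5.0, 4.0, 8.0]))
```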