Model Monitoring
A guide to Model Monitoring with Roboflow.
Roboflow's Model Monitoring dashboard gives you unparalleled visibility into your models, from prototyping all the way through production. With Model Monitoring, you can view high-level statistics to get insight into how your models are performing over time, or drill into individual inference requests to see how your models perform on edge cases.
Model Monitoring is only available for select plans. For the latest information, see our Pricing page.
To view your Model Monitoring dashboard, click the "Monitoring" tab in your workspace.
Immediately, you will see three statistics pertaining to your models:
Total requests: The total number of inferences made to all models in your workspace.
Average confidence: The average confidence across all predictions made by your models.
Average inference time: The average inference time across all inferences (the time in seconds it took to produce the predictions, including image preprocessing).
The % change values are based on the current period vs the previous period. By default, these statistics will show your data for the last week. However, you can modify the time range using the buttons on top of the statistics.
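To make the three statistics and the % change concrete, here is a small sketch of how they could be computed from raw inference records. The record shape and sample values are illustrative, not Roboflow's internal schema.

```python
from statistics import mean

# Hypothetical inference records, mimicking the fields Model Monitoring
# aggregates: each has a prediction confidence and an inference time (seconds).
current_period = [
    {"confidence": 0.91, "inference_time": 0.042},
    {"confidence": 0.87, "inference_time": 0.038},
    {"confidence": 0.78, "inference_time": 0.051},
]
previous_period = [
    {"confidence": 0.80, "inference_time": 0.060},
    {"confidence": 0.82, "inference_time": 0.055},
]

def summarize(records):
    """Compute the three dashboard statistics for one time period."""
    return {
        "total_requests": len(records),
        "average_confidence": mean(r["confidence"] for r in records),
        "average_inference_time": mean(r["inference_time"] for r in records),
    }

def pct_change(current, previous):
    """% change of the current period vs the previous period."""
    return (current - previous) / previous * 100

cur, prev = summarize(current_period), summarize(previous_period)
print(cur["total_requests"])  # 3
print(round(pct_change(cur["total_requests"], prev["total_requests"])))  # 50
```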
The Models table lists every model that has received inferences; clicking a model takes you to its Model Dashboard.
You can also access tabs for viewing Recent Inferences (across all models) and setting Alerts.
Under the Models tab, you can select a specific model to view its data. There, you'll see the same statistics as the Workspace Overview, but specific to one model.
Here, in addition to the statistics, you can view the number of detections for each class in the model, and see its distribution with respect to other classes.
Clicking on the "See All Inferences" button at the top right of the table will navigate you to the Inferences Table.
Here, you can see all the prediction results for your model. In addition, you will also see any custom metadata that was added to your inferences. To view a subset of your inferences, you can use the filters on the top-right of the table.
From the Inferences Table, you have the ability to drill down into a specific inference and see more details. Let's break it down in the order shown in this image:
Image: Here, you can see the image that was inferred. Note: This isn't enabled by default. See Enabling Inference Images
Inference Details: On this panel, you can view all the details and properties of your inference request. All available fields are shown by default, but if you want to hide some, you can click the "Cog" icon in the top right corner to hide fields. (This setting will persist in your browser.)
On some fields, if available, there will be an option to search for inferences based on that field. On the highlighted example, it will search for inferences from the same model.
Detections: This collapsible pane shows a list of detections received from that inference. You can click on the "Class" and "Confidence" table headers to choose the sort order of the table.
Download & Link buttons: Here, you can download the image associated with the inference or copy a link to this Inference Details for later reference.
Images saved by Active Learning or Dataset Upload will count the same as uploading an image to your project. Credit, limit or quota usage may apply according to your plan type.
There are two ways to enable inference images to show up in Model Monitoring:
Roboflow Dataset Upload block: In Workflows, you can add a "Roboflow Dataset Upload" block. Once you connect the predictions and the prediction image, inference images will show up in Model Monitoring.
Active Learning (legacy): For legacy workspaces, you can enable "Active Learning" rules from your project's page:
You and other members of your team can subscribe to real-time alerts when issues or anomalies occur with your model. For example, if your model's confidence suddenly decreases, or your Inference Server goes down and your model stops running, your team will receive an email notification.
See more info on the Alerting page.

To attach additional metadata to an inference, you can use Model Monitoring's custom metadata feature. Using custom metadata, you can add information to an inference such as the location where the image was taken, the expected value of the prediction, and so on. Your custom metadata will show up in the "Recent Inferences" and "All Inferences" views.
To attach custom metadata to an inference result, please see the Custom Metadata API documentation.
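As a rough sketch of what such a call could look like, the snippet below builds a request attaching a key/value pair to an inference ID. The endpoint path, payload field names, and inference ID are all hypothetical placeholders; the Custom Metadata API documentation has the real schema.

```python
import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # assumption: your private Roboflow API key

def build_metadata_request(inference_ids, field_name, field_value):
    """Build (but do not send) a hypothetical custom-metadata POST request.

    The URL and payload shape below are illustrative only -- consult the
    Custom Metadata API documentation for the documented route and fields.
    """
    payload = {
        "inference_ids": inference_ids,
        "field_name": field_name,
        "field_value": field_value,
    }
    return urllib.request.Request(
        f"https://api.roboflow.com/custom-metadata?api_key={API_KEY}",  # placeholder path
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Example: tag an inference with the location where its image was taken.
req = build_metadata_request(["abc-123"], "location", "warehouse-7")
# urllib.request.urlopen(req) would send it.
```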
For automation and integration into external systems, you can pull Model Monitoring statistics using our API for model monitoring.
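A minimal sketch of such an integration is shown below: it constructs the query for a date-ranged statistics pull. The route (`inference-stats`), query parameter names, and workspace slug are assumptions for illustration; check the Model Monitoring API documentation for the actual endpoint.

```python
import urllib.parse

API_KEY = "YOUR_API_KEY"  # assumption: your private Roboflow API key

def build_stats_url(workspace, start_date, end_date):
    """Build a hypothetical URL for pulling Model Monitoring statistics.

    Endpoint path and parameter names are illustrative placeholders, not
    the documented API surface.
    """
    params = urllib.parse.urlencode({
        "api_key": API_KEY,
        "startDate": start_date,
        "endDate": end_date,
    })
    return f"https://api.roboflow.com/{workspace}/inference-stats?{params}"

url = build_stats_url("my-workspace", "2024-01-01", "2024-01-07")
# Fetching this URL (e.g. with urllib.request.urlopen) would return the stats
# as JSON for dashboards, alerting pipelines, or other external systems.
```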
Model Monitoring supports inference requests made using Roboflow's Hosted API or the Roboflow Inference Server, provided the Inference Server has internet access. This includes edge deployments that use Roboflow's License Server.
At this time, Model Monitoring does not support inference requests made using the Inference Pipeline; however, we plan to add support in the near future.
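For reference, requests of the following shape, sent to the Hosted API, are the kind that show up in Model Monitoring. This is a stdlib-only sketch that builds the request without sending it; the model ID and API key are placeholders, and for a self-hosted Inference Server you would swap the base URL (e.g. `http://localhost:9001`).

```python
import base64
import urllib.request

API_KEY = "YOUR_API_KEY"   # assumption: your Roboflow API key
MODEL_ID = "my-project/1"  # assumption: your model's project slug / version

def build_inference_request(image_bytes):
    """Build (but do not send) a Hosted API inference request.

    The Hosted API accepts a base64-encoded image in the POST body; the
    resulting inference is then visible in the Model Monitoring dashboard.
    """
    body = base64.b64encode(image_bytes)
    return urllib.request.Request(
        f"https://detect.roboflow.com/{MODEL_ID}?api_key={API_KEY}",
        data=body,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
        method="POST",
    )

req = build_inference_request(b"\x89PNG fake image bytes for illustration")
# urllib.request.urlopen(req) would send it; the JSON response contains the
# predictions, and the request is recorded by Model Monitoring.
```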