Model Monitoring

A guide to Model Monitoring with Roboflow.


Roboflow's Model Monitoring dashboard gives you visibility into your models, from prototyping all the way through production. With Model Monitoring, you can view high-level statistics to get insight into how your models are performing over time, or view individual inference requests to see how your models perform on edge cases.

Accessing Model Monitoring

Model Monitoring is only available for select plans. For the latest information, see our Pricing page.

To view your Model Monitoring dashboard, click the "Monitoring" tab in your workspace.

Workspace Dashboard

Immediately, you will see three statistics pertaining to your models:

  • Total requests: The total number of inferences made to all models in your workspace.

  • Average confidence: The average confidence across all predictions made by your models.

  • Average inference time: The average inference time across all inferences (the time, in seconds, it took to produce the predictions, including image preprocessing).

The % change values compare the current period to the previous period. By default, these statistics show your data for the last week; however, you can modify the time range using the buttons above the statistics.

The Models table shows all models that have inferences on them; clicking on a model will take you to its Model Dashboard. You can also access tabs for viewing Recent Inferences (across all models) and setting Alerts.
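Each inference request you send contributes to these statistics. As a rough illustration (assuming the Python inference-sdk and an object detection model; the model ID and image path below are placeholders), the per-request values the dashboard averages are already present in a single Hosted API response:

```python
# Minimal sketch: read the per-request values that Model Monitoring aggregates
# (inference time and prediction confidences) from one Hosted API response.
# "your-project/1" and the image path are placeholders.
from inference_sdk import InferenceHTTPClient

client = InferenceHTTPClient(
    api_url="https://detect.roboflow.com",
    api_key="YOUR_ROBOFLOW_API_KEY",
)

result = client.infer("path/to/image.jpg", model_id="your-project/1")

confidences = [p["confidence"] for p in result["predictions"]]
avg_confidence = sum(confidences) / len(confidences) if confidences else 0.0

print("inference time (s):", result.get("time"))
print("average confidence:", round(avg_confidence, 3))
```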

Model Dashboard

Under the Models tab, you can select a specific model to view its data. There, you'll see the same statistics as the Workspace Overview, but specific to one model.

Here, in addition to the statistics, you can view the number of detections for each class in the model and see each class's distribution relative to the other classes.

Clicking the "See All Inferences" button at the top right of the table will navigate you to the Inferences Table.

Inferences Table

Here, you can see all the prediction results for your model, along with any custom metadata that was added to your inferences. To view a subset of your inferences, use the filters at the top right of the table.

Inference Details

From the Inferences Table, you can drill down into a specific inference and see more details:

  1. Inference Details: On this panel, you can view all the details and properties of your inference request. All available fields are shown by default; if you want to hide some, click the "Cog" icon in the top right corner. (This setting will persist in your browser.)

  2. On some fields, where available, there is an option to search for inferences based on that field; for example, searching for other inferences made with the same model.

  3. Detections: This collapsible pane shows a list of detections received from that inference. You can click the "Class" and "Confidence" table headers to choose the sort order of the table.

  4. Download & Link buttons: Here, you can download the image associated with the inference or copy a link to these Inference Details for later reference.

  5. Image: Here, you can see the image that was inferred. Note: this isn't enabled by default; see Enabling Inference Images.

Enabling Inference Images

Images saved by Active Learning or Dataset Upload will count the same as uploading an image to your project. Credit, limit or quota usage may apply according to your plan type.

There are two ways to enable inference images to show up in Model Monitoring:

  • Roboflow Dataset Upload block: In Workflows, you can add a "Roboflow Dataset Upload" block. Once you connect the predictions and the prediction image to it, inference images will show up in Model Monitoring (see the sketch after this list).

  • Active Learning (legacy): For legacy workspaces, you can enable "Active Learning" rules from your project's page.
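As a rough sketch (assuming the Python inference-sdk; the workspace name, workflow ID, image path, and API URL below are placeholders), a Workflow that includes a Roboflow Dataset Upload block can be run like this, and each run sends the image and predictions to your project so they appear in Model Monitoring:

```python
# Minimal sketch: run a deployed Workflow that contains a "Roboflow Dataset
# Upload" block. The block itself is configured in the Workflows UI; the
# workspace name, workflow ID, and image path here are placeholders.
from inference_sdk import InferenceHTTPClient

client = InferenceHTTPClient(
    api_url="https://serverless.roboflow.com",  # or your Inference Server URL
    api_key="YOUR_ROBOFLOW_API_KEY",
)

result = client.run_workflow(
    workspace_name="your-workspace",
    workflow_id="your-workflow-id",
    images={"image": "path/to/image.jpg"},
)

print(result)
```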

Alerting

You and other members of your team can subscribe to real-time alerts when issues or anomalies occur with your model. For example, if the confidence of your model suddenly decreases, or your Inference Server goes down and your model stops running, your team will receive an email notification.

See more info on the Alerting page.

Custom Metadata

To attach additional metadata to an inference, you can use Model Monitoring's custom metadata feature. Using custom metadata, you can add information to an inference such as the location where the image was taken, the expected value of the prediction, and so on. Your custom metadata will show up in the "Recent Inferences" and "All Inferences" views. To attach custom metadata to an inference result, see the Custom Metadata API documentation.
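As an illustrative, non-authoritative sketch (the endpoint path, query parameters, and payload fields below are assumptions; confirm the exact request format in the Custom Metadata API reference), attaching metadata to recorded inferences is an HTTP call against the Roboflow API:

```python
# Hypothetical sketch of attaching custom metadata to recorded inferences.
# The endpoint path and payload field names are assumptions for illustration;
# see the Custom Metadata API reference for the exact request format.
import requests

API_KEY = "YOUR_ROBOFLOW_API_KEY"
WORKSPACE = "your-workspace"  # placeholder workspace ID

payload = {
    "inference_ids": ["00000000-0000-0000-0000-000000000000"],  # IDs returned with each inference
    "field_name": "location",       # example metadata key
    "field_value": "warehouse-3",   # example metadata value
}

response = requests.post(
    f"https://api.roboflow.com/{WORKSPACE}/inference-stats/metadata",  # assumed path
    params={"api_key": API_KEY},
    json=payload,
)
response.raise_for_status()
print(response.json())
```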

Model Monitoring API

For automation and integration with external systems, you can pull Model Monitoring statistics using the Model Monitoring API.
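As another non-authoritative sketch (the endpoint path and parameter names are assumptions; see the Model Monitoring API reference for the exact format), pulling aggregate statistics follows the same pattern:

```python
# Hypothetical sketch of pulling aggregate Model Monitoring statistics.
# The endpoint path and parameter names are assumptions for illustration;
# see the Model Monitoring API reference for the exact request format.
import requests

API_KEY = "YOUR_ROBOFLOW_API_KEY"
WORKSPACE = "your-workspace"  # placeholder workspace ID

response = requests.get(
    f"https://api.roboflow.com/{WORKSPACE}/inference-stats",  # assumed path
    params={
        "api_key": API_KEY,
        "startDate": "2024-01-01",  # assumed parameter names
        "endDate": "2024-01-31",
    },
)
response.raise_for_status()
print(response.json())
```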

Supported Deployments

Model Monitoring supports inference requests made using Roboflow's Hosted API or the Roboflow Inference Server, provided the Inference Server has internet access. This includes edge deployments that use Roboflow's License Server.

At this time, Model Monitoring does not support inference requests made using the Inference Pipeline; however, we plan to add support in the near future.
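As a rough sketch (assuming the Python inference-sdk; the model ID, image path, and server URL are placeholders), the same client can point at either supported target, and requests made this way are reported to Model Monitoring as long as the server can reach the internet:

```python
# Minimal sketch: send requests through the Hosted API or a self-hosted
# Inference Server; both report to Model Monitoring when internet access
# is available. The model ID, image path, and local URL are placeholders.
from inference_sdk import InferenceHTTPClient

hosted = InferenceHTTPClient(
    api_url="https://detect.roboflow.com",  # Roboflow Hosted API
    api_key="YOUR_ROBOFLOW_API_KEY",
)

local = InferenceHTTPClient(
    api_url="http://localhost:9001",  # self-hosted Roboflow Inference Server
    api_key="YOUR_ROBOFLOW_API_KEY",
)

for client in (hosted, local):
    result = client.infer("path/to/image.jpg", model_id="your-project/1")
    print(len(result["predictions"]), "predictions")
```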