Annotation Insights

Roboflow provides statistics on annotations associated with your workspace and projects. You can view annotation insights in the Roboflow dashboard and through the REST API.

On August 1st, 2023, we improved how annotation metrics are tracked. The Annotation Insights v2 endpoint provides annotation data from August 1st, 2023 and later.

To retrieve annotation insights for a workspace, make a GET request to the following endpoint:

https://api.roboflow.com/${WORKSPACE}/stats

This endpoint accepts the following URL parameters:

  • api_key: API key for the workspace from which to retrieve statistics.

  • startDate: Retrieve statistics starting from this date. Accepts a date in YYYY-MM-DD format (e.g., 2023-08-01).

  • endDate: Retrieve statistics up to and including this date. Accepts a date in YYYY-MM-DD format (e.g., 2023-08-15).

  • projectId: Retrieve data only for the specified project.

  • userId: Retrieve data only for the specified user.
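The request above can be sketched in Python using only the standard library. The workspace ID and API key below are placeholders, and the optional parameters are shown commented out:

```python
# Sketch of a request URL for the Annotation Insights endpoint.
# "your-workspace" and "YOUR_API_KEY" are placeholders; substitute your own.
from urllib.parse import urlencode

WORKSPACE = "your-workspace"  # placeholder workspace ID

params = {
    "api_key": "YOUR_API_KEY",      # placeholder API key
    "startDate": "2023-08-01",
    "endDate": "2023-08-15",
    # "projectId": "projectId123",  # optional: restrict to one project
    # "userId": "labelerId123",     # optional: restrict to one labeler
}

url = f"https://api.roboflow.com/{WORKSPACE}/stats?{urlencode(params)}"
print(url)
# A GET request to this URL (e.g., with urllib.request or requests)
# returns the JSON payload shown below.
```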

Example response:

{
    "data": [
        {
            "approved": 73,
            "boxesDrawn": 127,
            "imagesLabeled": 73,
            "projectId": "projectId123",
            "projectName": "My CV Project",
            "markedNull": 8,
            "modelAssisted": 1,
            "rejected": 0,
            "labelerId": "labelerId123",
            "workspaceId": "workspaceId123",
            "approvalRate": 100
        }
    ],
    "labelers": [
        {
            "displayName": "Lenny",
            "email": "[email protected]",
            "id": "labelerId123"
        }
    ],
    "stats": {
        "numImagesLabeled": 73,
        "numBoxesDrawn": 127,
        "numImagesMarkedNull": 8,
        "totalImagesUsingModelAssist": 1,
        "approvalRate": 100
    }
}
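A response like the one above can be consumed with a few lines of Python. This sketch embeds a trimmed copy of the example payload (the labeler email is omitted) and joins the per-labeler rows to labeler names via the labelers array:

```python
import json

# Trimmed copy of the example response above, embedded for illustration.
payload = json.loads("""
{
  "data": [{"approved": 73, "boxesDrawn": 127, "imagesLabeled": 73,
            "projectId": "projectId123", "projectName": "My CV Project",
            "markedNull": 8, "modelAssisted": 1, "rejected": 0,
            "labelerId": "labelerId123", "workspaceId": "workspaceId123",
            "approvalRate": 100}],
  "labelers": [{"displayName": "Lenny", "id": "labelerId123"}],
  "stats": {"numImagesLabeled": 73, "numBoxesDrawn": 127,
            "numImagesMarkedNull": 8, "totalImagesUsingModelAssist": 1,
            "approvalRate": 100}
}
""")

# Map labeler IDs to display names, then summarize each per-labeler row.
labeler_names = {l["id"]: l["displayName"] for l in payload["labelers"]}
for row in payload["data"]:
    name = labeler_names.get(row["labelerId"], row["labelerId"])
    print(f"{name} labeled {row['imagesLabeled']} images "
          f"on {row['projectName']} ({row['approvalRate']}% approved)")

print("Workspace total:", payload["stats"]["numImagesLabeled"])
```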

Annotation Insights Data Structure

This endpoint returns a payload with the following structure:

  • data: Per-labeler metrics grouped by project. Each object represents one labeler's activity on a single project.

    • projectId: ID of the project (from session.datasetId).

    • projectName: Project name, resolved via getProjectsByIds.

    • projectType: Type of project (e.g., "object-detection").

    • labelerId: Unique ID of the labeler.

    • workspaceId: ID of the workspace this session belongs to.

    • imagesLabeled: Count of images on which the labeler created, edited, or deleted annotations, or marked the image null.

    • boxesDrawn: Net number of annotations created (equals boxesAdded - boxesRemoved).

    • markedNull: Number of images explicitly marked as null by the labeler.

    • modelAssisted: Count of images where model assist was used.

    • approvalRate: Approval percentage for this labeler/project pairing.

    • netBoxesAdded: boxesAdded - boxesRemoved.

    • netBoxesUpdated: Currently equivalent to boxesUpdated (included for consistency).

    • boxesAdded: Total number of annotations created.

    • boxesRemoved: Total number of annotations deleted.

    • boxesUpdated: Total number of annotations edited.
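The relationship between the raw counters and the net fields above can be shown with a short arithmetic sketch. The numbers are made up for illustration and are not taken from real API data:

```python
# Illustrates how the net fields relate to the raw box counters.
# The values are hypothetical, chosen so the net matches the sample response.
boxes_added = 140    # boxesAdded: annotations created
boxes_removed = 13   # boxesRemoved: annotations deleted
boxes_updated = 22   # boxesUpdated: annotations edited

# boxesDrawn and netBoxesAdded are both boxesAdded - boxesRemoved.
net_boxes_added = boxes_added - boxes_removed

# netBoxesUpdated is currently identical to boxesUpdated.
net_boxes_updated = boxes_updated

print(net_boxes_added)    # 127
print(net_boxes_updated)  # 22
```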

  • labelers: Metadata for each labeler ID present in data.

    • Derived from unique labelerIds.

    • id: Labeler’s user ID.

    • displayName: Name from the user profile.

    • email: Email address.

      • For system labelers (e.g., autolabelservice), returns placeholder values.

  • meta: Supplementary metadata.

    • notices: Array of disclaimers.

      • Currently includes a single item noting that metrics are available only after August 1, 2023.

  • stats: Aggregated workspace-level totals across all sessions.

    • numImagesLabeled: Total count of unique labeled images.

    • numBoxesDrawn: Net annotations created across all sessions.

    • numImagesMarkedNull: Images marked null at least once.

    • totalImagesUsingModelAssist: Images labeled with model assistance.

    • numBoxesAdded: Total number of annotations created.

    • numBoxesRemoved: Total number of annotations deleted.

    • numBoxesUpdated: Total number of annotations edited.

    • netBoxesAdded: numBoxesAdded - numBoxesRemoved.

    • netBoxesUpdated: numBoxesUpdated (included for naming symmetry).

    • approvalRate: Aggregate approval percentage across all labelers and projects.

Because each entry in data[] records a labeler's sessions separately, data[].imagesLabeled can count the same image more than once if it was labeled on multiple days or by multiple labelers. The workspace-level stats.numImagesLabeled field, by contrast, is derived from combinedData and counts each unique image only once across the date range. This difference often explains why the total image count from the API does not exactly match what the UI displays, depending on which metric the UI uses (per-session vs. unique images).
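The counting difference can be seen in a toy example. The sessions below are hypothetical (labeler ID, image ID) pairs, not real API data:

```python
# Toy illustration of per-session vs. unique-image counting.
# Each tuple is a hypothetical (labelerId, imageId) labeling session.
sessions = [
    ("labelerA", "img1"),
    ("labelerA", "img2"),
    ("labelerB", "img2"),  # img2 labeled again by a second labeler
    ("labelerA", "img1"),  # img1 revisited on a later day
]

# Summing per-labeler counts (like data[].imagesLabeled) counts sessions:
per_session_total = len(sessions)

# Workspace stats (like stats.numImagesLabeled) count unique images:
unique_images = len({image for _, image in sessions})

print(per_session_total, unique_images)  # 4 2
```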
