# Annotation Insights

Roboflow provides statistics on annotations associated with your workspace and projects. You can view annotation insights in the Roboflow dashboard and through the REST API.

<mark style="color:blue;">On August 1st, 2023, we improved how annotation metrics are tracked. The Annotation Insights v2 endpoint provides annotation data from August 1st, 2023 and later.</mark>

{% tabs %}
{% tab title="REST API" %}
To retrieve annotation insights for a workspace, make a GET request to the following endpoint:

```url
https://api.roboflow.com/${WORKSPACE}/stats
```

This endpoint accepts the following URL parameters:

| Parameter   | Description                                                                            | Required |
| ----------- | -------------------------------------------------------------------------------------- | -------- |
| `api_key`   | <a href="authenticate-with-the-rest-api" class="button primary">API Authentication</a> | Yes      |
| `startDate` | Start date in `YYYY-MM-DD` format. Data available from `2023-08-01` onward.            | Yes      |
| `endDate`   | End date in `YYYY-MM-DD` format.                                                       | Yes      |
| `project`   | Project slug (dataset URL) to filter results.                                          | Optional |
| `userId`    | User ID to filter results.                                                             | Optional |
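As a sketch, the request can be built in Python with the standard library; the workspace slug, dates, and key below are placeholders, not real values:

```python
import urllib.parse

# Placeholder values; substitute your own workspace slug and API key.
WORKSPACE = "my-workspace"
params = {
    "api_key": "YOUR_API_KEY",
    "startDate": "2023-08-01",  # data is available from this date onward
    "endDate": "2023-08-31",
    # Optional filters:
    # "project": "my-cv-project",
    # "userId": "labelerId123",
}

url = f"https://api.roboflow.com/{WORKSPACE}/stats?" + urllib.parse.urlencode(params)

# The response can then be fetched with any HTTP client, e.g.:
# import json, urllib.request
# with urllib.request.urlopen(url) as resp:
#     insights = json.load(resp)
```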

Example response:

```json
{
    "data": [
        {
            "approved": 73,
            "boxesDrawn": 127,
            "imagesLabeled": 73,
            "projectId": "projectId123",
            "projectName": "My CV Project",
            "markedNull": 8,
            "modelAssisted": 1,
            "rejected": 0,
            "labelerId": "labelerId123",
            "workspaceId": "workspaceId123",
            "approvalRate": 100
        }
    ],
    "labelers": [
        {
            "displayName": "Lenny",
            "email": "lenny@roboflow.foo",
            "id": "labelerId123"
        }
    ],
    "stats": {
        "numImagesLabeled": 73,
        "numBoxesDrawn": 127,
        "numImagesMarkedNull": 8,
        "totalImagesUsingModelAssist": 1,
        "approvalRate": 100
    }
}
```

{% endtab %}
{% endtabs %}

## Annotation Insights Data Structure

This endpoint returns a payload with the following structure:

* `data`: Per-labeler metrics grouped by project. Each object represents one labeler's activity on a single project.
  * `projectId`: ID of the project (from `session.datasetId`).
  * `projectName`: Project name, resolved via `getProjectsByIds`.
  * `projectType`: Type of project (e.g., `"object-detection"`).
  * `labelerId`: Unique ID of the labeler.
  * `workspaceId`: ID of the workspace this session belongs to.
  * `imagesLabeled`: Count of images where the labeler created, edited, or deleted annotations, or marked the image null.
  * `boxesDrawn`: Net number of annotations created (equals `boxesAdded - boxesRemoved`).
  * `markedNull`: Number of images explicitly marked as null by the labeler.
  * `modelAssisted`: Count of images where model assist was used.
  * `approvalRate`: Approval percentage for this labeler/project pairing.
  * `netBoxesAdded`: Sum, across sessions, of the number of new boxes created within each session (see the note below).
  * `netBoxesUpdated`: Currently equivalent to `boxesUpdated` (included for consistency).
  * `boxesAdded`: Total number of annotations created.
  * `boxesRemoved`: Total number of annotations deleted.
  * `boxesUpdated`: Total number of annotations edited.
* `labelers`: Metadata for each labeler ID present in `data`.
  * Derived from unique `labelerId`s.
  * `id`: Labeler’s user ID.
  * `displayName`: Name from the user profile.
  * `email`: Email address.
    * For system labelers (e.g., `autolabelservice`), returns placeholder values.
* `meta`: Supplementary metadata.
  * `notices`: Array of disclaimers.
    * Currently includes a single item noting that metrics are available only after August 1, 2023.
* `stats`: Aggregated workspace-level totals across all sessions.
  * `numImagesLabeled`: Total count of unique labeled images.
  * `numBoxesDrawn`: Net annotations created across all sessions.
  * `numImagesMarkedNull`: Images marked null at least once.
  * `totalImagesUsingModelAssist`: Images labeled with model assistance.
  * `numBoxesAdded`: Total number of annotations created.
  * `numBoxesRemoved`: Total number of annotations deleted.
  * `numBoxesUpdated`: Total number of annotations edited.
  * `netBoxesAdded`: `numBoxesAdded - numBoxesRemoved`.
  * `netBoxesUpdated`: `numBoxesUpdated` (included for naming symmetry).
  * `approvalRate`: Aggregate approval percentage across all labelers and projects.
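To illustrate how the net fields relate to the raw counts, here is a short Python sketch that recomputes them from a `stats` payload shaped like the response above; the numbers are illustrative, not from a real workspace:

```python
# Illustrative stats payload; values are made up for the example.
stats = {
    "numBoxesAdded": 140,
    "numBoxesRemoved": 13,
    "numBoxesUpdated": 22,
}

# Net fields as defined in the list above:
net_boxes_added = stats["numBoxesAdded"] - stats["numBoxesRemoved"]
net_boxes_updated = stats["numBoxesUpdated"]  # currently identical by definition

print(net_boxes_added)    # 127
print(net_boxes_updated)  # 22
```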

{% hint style="info" %}
Because the per-labeler `data[].imagesLabeled` counts record each session separately, the same image can be counted more than once if it was labeled on multiple days or by multiple labelers. The overall `stats.numImagesLabeled` field, however, is derived from `combinedData` and counts each unique image only once across the date range. This difference often explains why the total image count from the API may not exactly match what the UI displays, depending on which metric the UI uses (per-session vs. unique images).
{% endhint %}
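The per-session versus unique-image distinction can be sketched as follows; the session records here are invented for illustration:

```python
# Hypothetical session records: (labeler_id, image_id), one per labeling session.
session_records = [
    ("labelerA", "img1"),
    ("labelerA", "img2"),
    ("labelerB", "img1"),  # same image labeled again by a second labeler
]

# Summing per-labeler counts (analogous to data[].imagesLabeled) counts sessions:
per_session_total = len(session_records)

# The aggregate (analogous to stats.numImagesLabeled) counts unique images:
unique_images = len({image_id for _, image_id in session_records})

print(per_session_total)  # 3
print(unique_images)      # 2
```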

{% hint style="info" %}
`netBoxesAdded`: For each session, Roboflow counts the boxes added that are still present when the session ends; a box that is added and then removed within the same session is not counted. The `netBoxesAdded` stat sums these per-session net additions and does not subtract deletions made in later sessions. It therefore equals the total number of new boxes created within each session, not the dataset's final box count.
{% endhint %}
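The session-level counting rule above can be sketched as a small simulation; the event format and helper function here are hypothetical, not part of the API:

```python
# Hypothetical per-session event logs: "add" and "remove" events for box IDs.
sessions = [
    [("add", "b1"), ("add", "b2"), ("remove", "b2")],  # b2 removed before session end: not counted
    [("add", "b3")],
    [("remove", "b1")],  # deleting a box from an earlier session is ignored here
]

def boxes_added_in_session(events):
    """Boxes added in this session that are still present when it ends."""
    added = set()
    for action, box_id in events:
        if action == "add":
            added.add(box_id)
        elif action == "remove":
            added.discard(box_id)  # only cancels adds made within this session
    return len(added)

net_boxes_added = sum(boxes_added_in_session(s) for s in sessions)
print(net_boxes_added)  # 2: b1 and b3 (b1's later deletion does not reduce the total)
```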
