Roboflow Docs
Run inference


The goal of building computer vision models is to run "inference"; i.e., give the computer an image and ask it, "tell me what you see!"

You can use the Roboflow CLI to run inference either with models you have trained on Roboflow, or with open source models available on Roboflow Universe.

When you run roboflow infer in the command line, the Roboflow CLI sends your image to the Roboflow API and prints the results (predictions) in JSON format.

Let's look at an example using an open source model from Roboflow Universe: the poker-cards dataset, an open source Roboflow project with a trained model capable of identifying poker cards.

From the URL in the browser:

  • workspaceId="roboflow-100"

  • projectId="poker-cards-cxcvz"

  • version=1
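These three IDs can be read directly out of the model page's address. The layout below is an assumption based on the current Universe URL scheme, shown for orientation only:

```
https://universe.roboflow.com/roboflow-100/poker-cards-cxcvz/model/1
                              └workspaceId┘ └── projectId ──┘      └ version
```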

Then, if you have a local file with a card image, you can run a command like:

roboflow infer -c .70 -w roboflow-100 -m poker-cards-cxcvz/1 ~/Downloads/ace.jpg
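The CLI prints the predictions as JSON. The exact fields depend on the project type; for an object detection model like this one, the response typically looks something like the following (field names reflect the usual Roboflow detection response; the values and card label here are made up for illustration):

```json
{
  "predictions": [
    {
      "x": 320.5,
      "y": 241.0,
      "width": 110.0,
      "height": 160.0,
      "confidence": 0.92,
      "class": "AS"
    }
  ],
  "image": { "width": 640, "height": 480 }
}
```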

Running inference with your own models works the same way: just specify the workspace, project, and version for a model that your user has access to.
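Because the CLI writes plain JSON to stdout, you can pipe its output into any JSON-aware tool for post-processing. Here is a minimal Python sketch that filters detections by confidence; the sample response and card labels are illustrative, not taken from a real run:

```python
import json

# Illustrative response in the shape the Roboflow API typically returns;
# the field names and values are assumptions made for this sketch.
sample_output = """
{
  "predictions": [
    {"x": 320.5, "y": 241.0, "width": 110.0, "height": 160.0,
     "confidence": 0.92, "class": "AS"},
    {"x": 101.0, "y": 88.0, "width": 95.0, "height": 140.0,
     "confidence": 0.41, "class": "KH"}
  ]
}
"""

def confident_classes(raw_json: str, threshold: float = 0.7) -> list[str]:
    """Return class labels of predictions at or above the threshold."""
    data = json.loads(raw_json)
    return [p["class"] for p in data.get("predictions", [])
            if p["confidence"] >= threshold]

print(confident_classes(sample_output))  # only the 0.92 "AS" detection passes
```

With the default threshold of 0.7, only the first detection survives; lowering the threshold to 0.4 would keep both.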

See all the supported parameters with roboflow infer --help.
