Web Browser

Roboflow provides JavaScript packages for deploying computer vision models in web browsers.

inference-sdk

Uses WebRTC to stream your video to the Roboflow Cloud for inference and returns live results with minimal latency.

  • For live video streams

  • When running Roboflow Workflows, or a model not supported by inferencejs

  • On latency-sensitive use cases with compute-heavy models

Learn more about inference-sdk
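A cloud-streaming setup might look like the sketch below. Note that this is illustrative only: the import path and the function and option names (`startStream`, `workflowId`, `onPrediction`) are assumptions, not the documented inference-sdk API, and the API key and Workflow ID are placeholders. Consult the inference-sdk documentation for the real entry points.

```javascript
// Illustrative sketch only — startStream, workflowId, and onPrediction are
// hypothetical names, not the documented inference-sdk API.
import { startStream } from "inference-sdk"; // hypothetical import

// Capture the user's camera, stream it to the Roboflow Cloud over WebRTC,
// and receive live predictions from a Workflow.
const media = await navigator.mediaDevices.getUserMedia({ video: true });

const stream = await startStream({
  apiKey: "<API_KEY>",                  // placeholder credentials
  workflowId: "<WORKSPACE>/<WORKFLOW>", // placeholder Workflow ID
  source: media,
  onPrediction: (result) => {
    console.log(result); // live results as they arrive from the cloud
  },
});
```

Because inference runs in the Roboflow Cloud, this pattern suits compute-heavy models that would be too slow to run on the user's device.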

inferencejs

Uses TensorFlow.js to run inference on your images and video streams on-device.

  • For images and live video streams

  • On latency-sensitive use cases with light, supported models

  • When internet access isn't continuously available (still required for the initial model load)

Learn more about inferencejs
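A minimal on-device example with inferencejs is sketched below. The model ID, version, and publishable key are placeholder values to substitute with your own; the image element ID is also an assumption.

```javascript
// Run a Roboflow model in the browser with inferencejs (on-device inference).
import { InferenceEngine, CVImage } from "inferencejs";

const inferEngine = new InferenceEngine();

// Load the model into a web worker — this downloads the model weights,
// which is why internet access is required for the initial load.
const workerId = await inferEngine.startWorker(
  "<MODEL_ID>",       // placeholder: model ID from your Roboflow workspace
  1,                  // placeholder: model version
  "<PUBLISHABLE_KEY>" // placeholder: your publishable API key
);

// Run inference on an image element already on the page
// ("image" is an assumed element ID).
const image = new CVImage(document.getElementById("image"));
const predictions = await inferEngine.infer(workerId, image);
console.log(predictions);
```

After the initial model load, subsequent `infer` calls run entirely in the browser, so there is no per-frame network round trip.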

Comparison

Feature | inference-sdk | inferencejs
--- | --- | ---
Processing Location | Roboflow Cloud | Browser (on-device)
Processing Latency | Consistent (GPU-accelerated) | Dependent on user's device
Model Support | All Roboflow models & Roboflow Workflows | Light, supported models only
Device Support | Wide (WebRTC is widely supported) | Wide (TensorFlow.js is widely supported)
Internet Required | Yes, continuously | Yes, for initial model load only
Network Latency | Minimal | None (inference runs on-device)
