Web Browser
Roboflow provides JavaScript packages for deploying computer vision models in web browsers.
inference-sdk
Uses WebRTC to run inference on your video streams in the Roboflow Cloud and return live results with minimal latency.
Choose inference-sdk:
- For live video streams
- When running Roboflow Workflows, or a model not supported on inferencejs
- For latency-sensitive use cases with compute-heavy models
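The sketch below illustrates the browser side of this pattern using only standard Web APIs (getUserMedia, RTCPeerConnection, and a data channel). It is not the inference-sdk API itself, and the signaling exchange with the cloud endpoint is left as a placeholder; refer to the inference-sdk documentation for the actual calls.

```js
// Generic WebRTC sketch: capture a camera stream, attach it to a peer
// connection, and listen on a data channel for results. inference-sdk
// wraps this pattern and handles signaling with the Roboflow Cloud for you.
async function startStream() {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });

  const pc = new RTCPeerConnection({
    iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
  });
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));

  // One way live predictions could come back alongside the stream.
  const results = pc.createDataChannel("predictions");
  results.onmessage = (event) => console.log("prediction:", event.data);

  // Create and set the local SDP offer. Exchanging it with the remote
  // inference endpoint (signaling) is intentionally omitted here.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
}

startStream();
```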
inferencejs
Uses TensorFlow.js to run inference on your images and video streams on-device.
Choose inferencejs:
- For images and live video streams
- For latency-sensitive use cases with light, supported models
- When internet access isn't continuously available (still required for the initial model load)
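A minimal usage sketch follows, assuming the InferenceEngine, startWorker, and infer entry points shown in Roboflow's inferencejs examples; the model ID, version, and publishable key are placeholders, and exact signatures may differ from the current package documentation.

```js
import { InferenceEngine, CVImage } from "inferencejs";

// Placeholder model ID, version, and publishable key; substitute your own.
const engine = new InferenceEngine();
const workerId = await engine.startWorker("your-model-id", 1, "rf_YOUR_PUBLISHABLE_KEY");

// Wrap an <img> element already on the page and run on-device inference.
const image = new CVImage(document.getElementById("source-image"));
const predictions = await engine.infer(workerId, image);
console.log(predictions);
```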
Comparison
| | inference-sdk | inferencejs |
| --- | --- | --- |
| Processing Location | Roboflow Cloud | Browser (On Device) |
| Processing Latency | Consistent (GPU accelerated) | Dependent on user's device |
| Device Support | Wide (WebRTC is widely supported) | Wide (TensorFlow.js is widely supported) |
| Internet Required | Yes, continuously | Yes, for initial model load only |
| Network Latency | Minimal | None |