# Web inference-sdk

Run real-time video inference from your browser, on the Roboflow cloud, with inference-sdk.

## What is WebRTC Streaming?
`@roboflow/inference-sdk` enables real-time video streaming from your browser to Roboflow's inference servers using WebRTC. This allows you to:

- **Execute Workflows** - Run complex multi-step computer vision pipelines
- **Access All Models** - Use any Roboflow model type
- **Server-Side Processing** - Leverage powerful GPUs
- **Low Latency** - WebRTC provides near-real-time results
- **Bidirectional Communication** - Send and receive data during streaming
## Installation

```bash
npm install @roboflow/inference-sdk
```

## Quick Start
Take a look at the video/sample code below to get started:
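As a rough sketch of what a quick start can look like (the `inference.stream` entry point, its parameter names, and the camera wiring below are assumptions for illustration; only `connectors.withApiKey` is named on this page, so check the package README for the real API):

```javascript
// Hedged quick-start sketch. Assumes the SDK exposes an `inference.stream`
// call that takes a connector, a MediaStream source, and Workflow
// coordinates; verify against the actual @roboflow/inference-sdk docs.
async function quickStart(videoElement) {
  // Dynamic import keeps this sketch inert until called in a browser.
  const sdk = await import("@roboflow/inference-sdk");

  // Capture the user's camera as the WebRTC video source.
  const camera = await navigator.mediaDevices.getUserMedia({ video: true });

  // Demo-only connector: this embeds the API key in the browser.
  // See the security section below for the production pattern.
  const connector = sdk.connectors.withApiKey("YOUR_API_KEY");

  // Hypothetical call: stream the camera to a hosted Workflow and render
  // annotated frames into the supplied <video> element.
  return sdk.inference.stream({
    connector,
    source: camera,
    workspaceName: "your-workspace",   // placeholder values
    workflowId: "your-workflow",
    onPrediction: (predictions) => console.log(predictions),
    output: videoElement,
  });
}
```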
## 🔐 Security Best Practices

**NEVER expose your API key in frontend code for production applications.**

The `connectors.withApiKey()` method is convenient for demos but exposes your API key in the browser. For production, always use a backend proxy:

### Secure Production Pattern
**Frontend:**
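A sketch of the frontend half of the proxy pattern. The `/api/inference-proxy` route and the `connectors.withProxy` helper are illustrative assumptions, not confirmed SDK API; the point is that the browser talks only to your own backend:

```javascript
// Hedged frontend sketch: point the SDK at your own backend endpoint
// instead of embedding the API key with connectors.withApiKey.
async function startSecureStream(videoElement) {
  const sdk = await import("@roboflow/inference-sdk");
  const camera = await navigator.mediaDevices.getUserMedia({ video: true });

  // The browser never sees the API key; the backend injects it.
  // `withProxy` and the route name are assumptions for illustration.
  const connector = sdk.connectors.withProxy("/api/inference-proxy");

  return sdk.inference.stream({
    connector,
    source: camera,
    workspaceName: "your-workspace",   // placeholder values
    workflowId: "your-workflow",
    output: videoElement,
  });
}
```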
**Backend (Express):**
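A minimal Express-style sketch of the backend half. The route path, request shape, and upstream URL are assumptions; the real handshake the SDK expects is shown in the sample application repository. What matters is that the key lives only in server-side environment variables:

```javascript
// Hedged backend sketch: receive the browser's request, attach the secret
// API key server-side, and forward it upstream to Roboflow.
function buildProxyHandler({ apiKey, upstreamUrl }) {
  return async (req, res) => {
    // Forward the browser's payload, adding the key here so it never
    // ships to the client. The `api_key` field name is an assumption.
    const upstream = await fetch(upstreamUrl, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ ...req.body, api_key: apiKey }),
    });
    res.status(upstream.status).json(await upstream.json());
  };
}

async function startServer() {
  const express = (await import("express")).default;
  const app = express();
  app.use(express.json());
  app.post(
    "/api/inference-proxy",           // route name is illustrative
    buildProxyHandler({
      apiKey: process.env.ROBOFLOW_API_KEY, // keep the key in env vars
      upstreamUrl: process.env.ROBOFLOW_UPSTREAM_URL, // assumed config
    })
  );
  app.listen(3000);
}
```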
## Key Features

### Dynamic Output Reconfiguration

Change stream and data outputs at runtime without restarting:
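For example, something along these lines, assuming the object returned when you start streaming exposes a reconfiguration method (the `updateOutputs` name and its option fields are assumptions; the page confirms only that runtime reconfiguration is supported):

```javascript
// Hedged sketch: swap which Workflow outputs are streamed back, without
// tearing down the WebRTC connection. Method and field names are assumed.
async function switchOutputs(stream) {
  await stream.updateOutputs({
    streamOutput: "annotated_image", // which image output to render
    dataOutput: "predictions",       // which data field to receive
  });
}
```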
## Complete Working Example
For a full working example with both frontend and backend code, see the sample application repository. The sample app demonstrates:
- Proper backend proxy setup for API key security
- Camera streaming integration
- Error handling and connection management
- Production-ready patterns