Using with the REST API

Serverless Hosted API V2 uses a single endpoint for all models and Workflows:

https://serverless.roboflow.com
| Model Type | Serverless Hosted API V2 | Hosted API V1 |
| --- | --- | --- |
| Object detection, Keypoint detection | https://serverless.roboflow.com | https://detect.roboflow.com |
| Instance segmentation | https://serverless.roboflow.com | https://outline.roboflow.com |
| Classification | https://serverless.roboflow.com | https://classify.roboflow.com |
| Semantic segmentation | Not currently supported | https://segment.roboflow.com |
| Foundation models (i.e. CLIP, OCR, YOLO-World) | https://serverless.roboflow.com | https://infer.roboflow.com |
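
If you are calling a model directly over HTTP, a minimal Python sketch is shown below. It assumes the V2 endpoint accepts the same request format as the V1 hosted API with only the host swapped; the model ID, image file, and API key are placeholders to replace with your own.

import base64
import requests

MODEL_ID = "model-id/1"   # placeholder: your model ID and version
API_KEY = "API_KEY"       # placeholder: your Roboflow API key

# Read and base64-encode the image, as expected by the hosted inference API.
with open("your_image.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

response = requests.post(
    f"https://serverless.roboflow.com/{MODEL_ID}",
    params={"api_key": API_KEY},
    data=image_b64,
    headers={"Content-Type": "application/x-www-form-urlencoded"},
    timeout=60,
)
response.raise_for_status()
print(response.json())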

Making a request over HTTP

Endpoint to run a predefined workflow:

POST /{workspace_name}/workflows/{workflow_id}

Checks the Roboflow API for the workflow definition; once acquired, parses and executes it, injecting runtime parameters from the request body.

Path parameters

workspace_name — string, required
workflow_id — string, required
Body

api_key — string or null, optional. Roboflow API key that will be passed to the model during initialization for artifact retrieval.
excluded_fields — string[] or null, optional. List of fields that should be excluded from the response (among those defined in the workflow specification).
enable_profiling — boolean, optional, default false. Flag to request Workflow run profiling. Enables the Workflow profiler only when server settings allow profiling traces to be exported to clients. Only applies to Workflow definitions saved on the Roboflow platform.
workflow_id — string or null, optional. Optional identifier of the workflow.
use_cache — boolean, optional, default true. Controls usage of the cache for workflow definitions. Set this to false when you frequently modify a definition saved in the Roboflow app and want to fetch the newest version for each request.
Responses

200 — Successful Response (application/json)
Example request:

POST /{workspace_name}/workflows/{workflow_id} HTTP/1.1
Host: serverless.roboflow.com
Content-Type: application/json
Accept: */*
Content-Length: 156
{
  "api_key": "text",
  "inputs": {
    "ANY_ADDITIONAL_PROPERTY": "anything"
  },
  "excluded_fields": [
    "text"
  ],
  "enable_profiling": false,
  "workflow_id": "text",
  "use_cache": true
}
Example response (200):

{
  "outputs": [
    {
      "ANY_ADDITIONAL_PROPERTY": "anything"
    }
  ],
  "profiler_trace": [
    {
      "ANY_ADDITIONAL_PROPERTY": "anything"
    }
  ]
}
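
The same endpoint can also be called from Python. The sketch below mirrors the request above; the workspace name, workflow ID, and the "image" input name are placeholders that depend on your own workflow definition.

import requests

WORKSPACE_NAME = "your-workspace"  # placeholder
WORKFLOW_ID = "your-workflow-id"   # placeholder
API_KEY = "API_KEY"                # placeholder: your Roboflow API key

response = requests.post(
    f"https://serverless.roboflow.com/{WORKSPACE_NAME}/workflows/{WORKFLOW_ID}",
    json={
        "api_key": API_KEY,
        # "inputs" carries the runtime parameters your workflow defines;
        # the "image" key and URL format here are only an example.
        "inputs": {
            "image": {"type": "url", "value": "https://example.com/image.jpg"},
        },
        "use_cache": True,
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["outputs"])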

Making a request with the Python SDK

If you are using Python, the most convenient way to interact with the Serverless API is the Inference Python SDK.

To use the SDK, first install it:

pip install inference-sdk

To make a request to the Serverless Hosted API V2, use the following code:

from inference_sdk import InferenceHTTPClient

CLIENT = InferenceHTTPClient(
    api_url="https://serverless.roboflow.com",
    api_key="API_KEY"
)

result = CLIENT.infer("your_image.jpg", model_id="model-id/1")

Above, replace model-id/1 with your model ID and API_KEY with your Roboflow API key.
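
The SDK can also run Workflows against the same endpoint. Below is a minimal sketch, assuming the run_workflow helper available in recent versions of inference-sdk; the workspace name, workflow ID, and "image" input name are placeholders that depend on your workflow definition.

from inference_sdk import InferenceHTTPClient

client = InferenceHTTPClient(
    api_url="https://serverless.roboflow.com",
    api_key="API_KEY",  # placeholder: your Roboflow API key
)

result = client.run_workflow(
    workspace_name="your-workspace",     # placeholder
    workflow_id="your-workflow-id",      # placeholder
    images={"image": "your_image.jpg"},  # input name depends on your workflow definition
)
print(result)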
