Create batch

POST /batches

Creates and executes a batch from an uploaded file of requests.

application/json

Body Required

  • input_file_id string Required

    The ID of an uploaded file that contains requests for the new batch.

    See upload file for how to upload a file.

    Your input file must be formatted as a JSONL file, and must be uploaded with the purpose batch. The file can contain up to 50,000 requests, and can be up to 100 MB in size. A sketch of the upload-and-create flow follows this parameter list.

  • endpoint string Required

    The endpoint to be used for all requests in the batch. Currently /v1/chat/completions, /v1/embeddings, and /v1/completions are supported. Note that /v1/embeddings batches are also restricted to a maximum of 50,000 embedding inputs across all requests in the batch.

    Values are /v1/chat/completions, /v1/embeddings, or /v1/completions.

  • completion_window string Required

    The time frame within which the batch should be processed. Currently only 24h is supported.

    Value is 24h.

  • metadata object | null

    Optional custom metadata for the batch.


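A minimal end-to-end sketch, assuming the official openai Python SDK: upload a JSONL input file with the purpose batch, then create the batch against /v1/chat/completions. The file name, model, and metadata values are placeholders rather than requirements of this endpoint, and the request-line shape shown in the comment is one common form for /v1/chat/completions batches.

from openai import OpenAI

client = OpenAI()  # reads the API key from the OPENAI_API_KEY environment variable

# Each line of batch_input.jsonl is a single JSON request object, for example:
# {"custom_id": "request-1", "method": "POST", "url": "/v1/chat/completions",
#  "body": {"model": "gpt-4o-mini", "messages": [{"role": "user", "content": "Hello"}]}}
batch_input_file = client.files.create(
    file=open("batch_input.jsonl", "rb"),
    purpose="batch",  # batch inputs must be uploaded with purpose "batch"
)

# Create the batch from the uploaded file
batch = client.batches.create(
    input_file_id=batch_input_file.id,
    endpoint="/v1/chat/completions",
    completion_window="24h",
    metadata={"description": "nightly eval run"},  # optional
)
print(batch.id, batch.status)
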
Responses

  • 200 application/json

    Batch created successfully.

    • id string Required

      The unique identifier for the batch.
    • object string Required

      The object type, which is always batch.

      Value is batch.

    • endpoint string Required

      The OpenAI API endpoint used by the batch.

    • errors object
      • object string

        The object type, which is always list.

      • data array[object]
        • code string

          An error code identifying the error type.

        • message string

          A human-readable message providing more details about the error.

        • param string | null

          The name of the parameter that caused the error, if applicable.

        • line integer | null

          The line number of the input file where the error occurred, if applicable.

    • input_file_id string Required

      The ID of the input file for the batch.

    • completion_window string Required

      The time frame within which the batch should be processed.

    • status string Required

      The current status of the batch. A polling sketch that uses these statuses follows the response attributes.

      Values are validating, failed, in_progress, finalizing, completed, expired, cancelling, or cancelled.

    • output_file_id string

      The ID of the file containing the outputs of successfully executed requests.

    • error_file_id string

      The ID of the file containing the outputs of requests with errors.

    • created_at integer Required

      The Unix timestamp (in seconds) for when the batch was created.

    • in_progress_at integer

      The Unix timestamp (in seconds) for when the batch started processing.

    • expires_at integer

      The Unix timestamp (in seconds) for when the batch will expire.

    • finalizing_at integer

      The Unix timestamp (in seconds) for when the batch started finalizing.

    • completed_at integer

      The Unix timestamp (in seconds) for when the batch was completed.

    • failed_at integer

      The Unix timestamp (in seconds) for when the batch failed.

    • expired_at integer

      The Unix timestamp (in seconds) for when the batch expired.

    • cancelling_at integer

      The Unix timestamp (in seconds) for when the batch started cancelling.

    • cancelled_at integer

      The Unix timestamp (in seconds) for when the batch was cancelled.

    • request_counts object

      The request counts for different statuses within the batch.

      • total integer Required

        Total number of requests in the batch.

      • completed integer Required

        Number of requests that have been completed successfully.

      • failed integer Required

        Number of requests that have failed.

    • metadata object | null

      Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long.

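A minimal polling sketch, again assuming the official openai Python SDK: retrieve the batch by ID, wait while its status is still one of the in-flight values above, and download the output file once the batch completes. The batch ID and polling interval are placeholders.

import time

from openai import OpenAI

client = OpenAI()

batch = client.batches.retrieve("batch_abc123")  # placeholder batch ID
while batch.status in ("validating", "in_progress", "finalizing"):
    time.sleep(60)  # poll roughly once a minute
    batch = client.batches.retrieve(batch.id)

if batch.status == "completed" and batch.output_file_id:
    output = client.files.content(batch.output_file_id)
    print(output.text)  # one JSON result per line, mirroring the input JSONL
else:
    print(batch.status, batch.request_counts)
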
POST /batches
curl \
 -X POST https://api.openai.com/v1/batches \
 -H "Authorization: Bearer $ACCESS_TOKEN" \
 -H "Content-Type: application/json" \
 -d '{"input_file_id":"string","endpoint":"/v1/chat/completions","completion_window":"24h","metadata":{"key":"string"}}'
Request example
{
  "input_file_id": "string",
  "endpoint": "/v1/chat/completions",
  "completion_window": "24h",
  "metadata": {
    "key": "string"
  }
}
Response examples (200)
{
  "id": "string",
  "object": "batch",
  "endpoint": "string",
  "errors": {
    "object": "string",
    "data": [
      {
        "code": "string",
        "message": "string",
        "param": "string",
        "line": 42
      }
    ]
  },
  "input_file_id": "string",
  "completion_window": "string",
  "status": "validating",
  "output_file_id": "string",
  "error_file_id": "string",
  "created_at": 42,
  "in_progress_at": 42,
  "expires_at": 42,
  "finalizing_at": 42,
  "completed_at": 42,
  "failed_at": 42,
  "expired_at": 42,
  "cancelling_at": 42,
  "cancelled_at": 42,
  "request_counts": {
    "total": 42,
    "completed": 42,
    "failed": 42
  },
  "metadata": {}
}