LLM
For instructions on how to authenticate to use this endpoint, see API overview.
Endpoints
POST /api/projects/:project_id/llm_gateway/v1/chat/completions
POST /api/projects/:project_id/llm_gateway/v1/messages
Create LLM gateway v1 chat completions
Required API key scopes
task:write
Path parameters
- project_id (string)
Query parameters
- format (string). One of: "json", "txt"
Request parameters
- model (string)
- messages (array)
- temperature (number)
- top_p (number)
- n (integer)
- stream (boolean). Default: false
- stream_options
- stop (array)
- max_tokens (integer)
- max_completion_tokens (integer)
- presence_penalty (number)
- frequency_penalty (number)
- logit_bias
- user (string)
- tools (array)
- tool_choice
- parallel_tool_calls (boolean)
- response_format
- seed (integer)
- logprobs (boolean)
- top_logprobs (integer)
- modalities (array)
- prediction
- audio
- reasoning_effort
- verbosity
- store (boolean)
- web_search_options
- functions (array)
- function_call
Response
Example request
POST /api/projects/:project_id/llm_gateway/v1/chat/completions
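A minimal request sketch in Python, built only from the parameters listed above. The host, model name, environment variable names, and bearer-token Authorization header are assumptions for illustration, not taken from this reference.

```python
import os
import requests

# Placeholder values: substitute your own host, project ID, and API key.
PROJECT_ID = os.environ["PROJECT_ID"]
API_KEY = os.environ["API_KEY"]  # a key with the task:write scope

url = f"https://app.example.com/api/projects/{PROJECT_ID}/llm_gateway/v1/chat/completions"

payload = {
    "model": "gpt-4o-mini",  # placeholder model name
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarise the last release notes."},
    ],
    "temperature": 0.2,
    "max_tokens": 256,
    "stream": False,  # default is false
}

response = requests.post(
    url,
    json=payload,
    headers={"Authorization": f"Bearer {API_KEY}"},  # assumed bearer-token auth
    timeout=30,
)
response.raise_for_status()
print(response.json())
```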
Example response
Status 200 Successful response with chat completion
Status 400 Invalid request parameters
Status 500 Internal server error
Create LLM gateway v1 messages
Create a message using Anthropic's Claude models. Compatible with Anthropic's Messages API format.
Required API key scopes
task:write
Path parameters
- project_id (string)
Query parameters
- format (string). One of: "json", "txt"
Request parameters
- model (string)
- messages (array)
- max_tokens (integer). Default: 4096
- temperature (number)
- top_p (number)
- top_k (integer)
- stream (boolean). Default: false
- stop_sequences (array)
- system
- metadata
- thinking
- tools (array)
- tool_choice
- service_tier
Response
Example request
POST /api/projects/:project_id/llm_gateway/v1/messages
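A minimal request sketch in Python for this Anthropic-compatible endpoint, using only parameters listed above. The host, model name, environment variable names, and bearer-token Authorization header are assumptions for illustration.

```python
import os
import requests

# Placeholder values: substitute your own host, project ID, and API key.
PROJECT_ID = os.environ["PROJECT_ID"]
API_KEY = os.environ["API_KEY"]  # a key with the task:write scope

url = f"https://app.example.com/api/projects/{PROJECT_ID}/llm_gateway/v1/messages"

payload = {
    "model": "claude-sonnet-4-20250514",  # placeholder Claude model name
    "max_tokens": 1024,  # defaults to 4096 if omitted
    "system": "You are a terse assistant.",
    "messages": [
        {"role": "user", "content": "List three uses for a paperclip."},
    ],
}

response = requests.post(
    url,
    json=payload,
    headers={"Authorization": f"Bearer {API_KEY}"},  # assumed bearer-token auth
    timeout=30,
)
response.raise_for_status()
print(response.json())
```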
Example response
Status 200 Successful response with generated message
Status 400 Invalid request parameters
Status 500 Internal server error