Environments
For instructions on how to authenticate to use these endpoints, see the API overview.
Endpoints
POST /api/environments/:project_id/llm_analytics/provider_keys
GET /api/environments/:project_id/llm_analytics/provider_keys/:id
PATCH /api/environments/:project_id/llm_analytics/provider_keys/:id
DELETE /api/environments/:project_id/llm_analytics/provider_keys/:id
POST /api/environments/:project_id/llm_analytics/provider_keys/:id/validate
POST /api/environments/:project_id/llm_analytics/summarization
POST /api/environments/:project_id/llm_analytics/summarization/batch_check
POST /api/environments/:project_id/llm_analytics/text_repr
POST /api/environments/:project_id/llm_analytics/translate
GET /api/environments/:project_id/llm_prompts
POST /api/environments/:project_id/llm_prompts
Create environments llm analytics provider keys
Required API key scopes
llm_provider_key:write
Path parameters
- project_id: string
Request parameters
- provider
- name: string
- api_key: string
- set_as_active: boolean (default: false)
Response
Example request
POST /api/environments/:project_id/llm_analytics/provider_keys
Example response
Status 201
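A minimal sketch of the create call, using only the standard library. The host, project ID, and key values are placeholders, and the `provider` value shown is illustrative; substitute your own:

```python
import json
import urllib.request

# Placeholder values -- substitute your own host, project ID, and API key.
HOST = "https://app.example.com"
PROJECT_ID = "123"
API_KEY = "<personal-api-key>"  # must carry the llm_provider_key:write scope

payload = {
    "provider": "openai",        # illustrative provider value
    "name": "production-key",
    "api_key": "<provider-api-key>",
    "set_as_active": False,      # optional; defaults to false
}

req = urllib.request.Request(
    f"{HOST}/api/environments/{PROJECT_ID}/llm_analytics/provider_keys",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Authorization": f"Bearer {API_KEY}",
             "Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would send it; expect HTTP 201 on success.
```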
Retrieve environments llm analytics provider keys
Required API key scopes
llm_provider_key:read
Path parameters
- id: string
- project_id: string
Response
Example request
GET /api/environments/:project_id/llm_analytics/provider_keys/:id
Example response
Status 200
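The retrieve call is a plain authenticated GET; a sketch with placeholder values:

```python
import urllib.request

# Placeholders -- substitute your own host, project ID, key ID, and API key.
HOST, PROJECT_ID, KEY_ID = "https://app.example.com", "123", "456"

req = urllib.request.Request(
    f"{HOST}/api/environments/{PROJECT_ID}/llm_analytics/provider_keys/{KEY_ID}",
    headers={"Authorization": "Bearer <personal-api-key>"},  # llm_provider_key:read scope
    method="GET",
)
# urllib.request.urlopen(req) would fetch it; expect HTTP 200.
```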
Update environments llm analytics provider keys
Required API key scopes
llm_provider_key:write
Path parameters
- id: string
- project_id: string
Request parameters
- provider
- name: string
- api_key: string
- set_as_active: boolean (default: false)
Response
Example request
PATCH /api/environments/:project_id/llm_analytics/provider_keys/:id
Example response
Status 200
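A sketch of a partial update (placeholder values throughout); with PATCH, only the fields being changed need to appear in the body:

```python
import json
import urllib.request

# Placeholders -- substitute your own host, project ID, key ID, and API key.
HOST, PROJECT_ID, KEY_ID = "https://app.example.com", "123", "456"

req = urllib.request.Request(
    f"{HOST}/api/environments/{PROJECT_ID}/llm_analytics/provider_keys/{KEY_ID}",
    data=json.dumps({"set_as_active": True}).encode("utf-8"),  # only the changed field
    headers={"Authorization": "Bearer <personal-api-key>",  # llm_provider_key:write scope
             "Content-Type": "application/json"},
    method="PATCH",
)
# urllib.request.urlopen(req) would apply it; expect HTTP 200.
```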
Delete environments llm analytics provider keys
Required API key scopes
llm_provider_key:write
Path parameters
- id: string
- project_id: string
Example request
DELETE /api/environments/:project_id/llm_analytics/provider_keys/:id
Example response
Status 204 No response body
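Deletion takes no body; a sketch with placeholder values:

```python
import urllib.request

HOST, PROJECT_ID, KEY_ID = "https://app.example.com", "123", "456"  # placeholders

req = urllib.request.Request(
    f"{HOST}/api/environments/{PROJECT_ID}/llm_analytics/provider_keys/{KEY_ID}",
    headers={"Authorization": "Bearer <personal-api-key>"},  # llm_provider_key:write scope
    method="DELETE",
)
# urllib.request.urlopen(req) would delete it; expect HTTP 204 with no body.
```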
Create environments llm analytics provider keys validate
Path parameters
- id: string
- project_id: string
Request parameters
- provider
- name: string
- api_key: string
- set_as_active: boolean (default: false)
Response
Example request
POST /api/environments/:project_id/llm_analytics/provider_keys/:id/validate
Example response
Status 200
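A sketch of the validate call. The body mirrors the create/update parameters listed above; all values here are placeholders:

```python
import json
import urllib.request

HOST, PROJECT_ID, KEY_ID = "https://app.example.com", "123", "456"  # placeholders

payload = {
    "provider": "openai",            # illustrative
    "name": "candidate-key",
    "api_key": "<provider-api-key>",
}

req = urllib.request.Request(
    f"{HOST}/api/environments/{PROJECT_ID}/llm_analytics/provider_keys/{KEY_ID}/validate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Authorization": "Bearer <personal-api-key>",
             "Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would run the check; expect HTTP 200.
```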
Create environments llm analytics summarization
Generate an AI-powered summary of an LLM trace or event.
This endpoint analyzes the provided trace/event, generates a line-numbered text representation, and uses an LLM to create a concise summary with line references.
Summary Format:
- 5-10 bullet points covering main flow and key decisions
- "Interesting Notes" section for failures, successes, or unusual patterns
- Line references in [L45] or [L45-52] format pointing to relevant sections
Use Cases:
- Quick understanding of complex traces
- Identifying key events and patterns
- Debugging with AI-assisted analysis
- Documentation and reporting
The response includes the summary text and optional metadata.
Required API key scopes
llm_analytics:write
Path parameters
- project_id: string
Request parameters
- summarize_type
- mode (default: minimal)
- data
- force_refresh: boolean (default: false)
Response
Example request
POST /api/environments/:project_id/llm_analytics/summarization
Example response
Status 200
Status 400
Status 403
Status 500
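A sketch of the summarization request. The allowed `summarize_type` values and the shape of `data` are not enumerated in this section, so both are assumptions here; host and IDs are placeholders:

```python
import json
import urllib.request

HOST, PROJECT_ID = "https://app.example.com", "123"  # placeholders

payload = {
    "summarize_type": "trace",            # assumed value; options aren't listed above
    "data": {"trace_id": "<trace-id>"},   # assumed shape for pointing at a trace
    "mode": "minimal",                    # default
    "force_refresh": False,               # default
}

req = urllib.request.Request(
    f"{HOST}/api/environments/{PROJECT_ID}/llm_analytics/summarization",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Authorization": "Bearer <personal-api-key>",  # llm_analytics:write scope
             "Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would send it; expect HTTP 200 with the summary text.
```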
Create environments llm analytics summarization batch check
Check which traces have cached summaries available.
This endpoint allows batch checking of multiple trace IDs to see which ones have cached summaries. Returns only the traces that have cached summaries with their titles.
Use Cases:
- Load cached summaries on session view load
- Avoid unnecessary LLM calls for already-summarized traces
- Display summary previews without generating new summaries
Path parameters
- project_id: string
Request parameters
- trace_ids: array
- mode (default: minimal)
Response
Example request
POST /api/environments/:project_id/llm_analytics/summarization/batch_check
Example response
Status 200
Status 400
Status 403
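A sketch of a batch check over several trace IDs (placeholder values). Only traces with cached summaries come back in the response:

```python
import json
import urllib.request

HOST, PROJECT_ID = "https://app.example.com", "123"  # placeholders

payload = {
    "trace_ids": ["<trace-id-1>", "<trace-id-2>"],  # traces to probe for cached summaries
    "mode": "minimal",                              # default
}

req = urllib.request.Request(
    f"{HOST}/api/environments/{PROJECT_ID}/llm_analytics/summarization/batch_check",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Authorization": "Bearer <personal-api-key>",
             "Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would send it; expect HTTP 200 listing only cached traces.
```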
Create environments llm analytics text repr
Generate a human-readable text representation of an LLM trace event.
This endpoint converts LLM analytics events ($ai_generation, $ai_span, $ai_embedding, or $ai_trace) into formatted text representations suitable for display, logging, or analysis.
Supported Event Types:
- $ai_generation: Individual LLM API calls with input/output messages
- $ai_span: Logical spans with state transitions
- $ai_embedding: Embedding generation events (text input → vector)
- $ai_trace: Full traces with hierarchical structure
Options:
- max_length: Maximum character count (default: 2000000)
- truncated: Enable middle-content truncation within events (default: true)
- truncate_buffer: Characters at start/end when truncating (default: 1000)
- include_markers: Use interactive markers vs plain text indicators (default: true)
  - Frontend: set true for <<<TRUNCATED|base64|...>>> markers
  - Backend/LLM: set false for ... (X chars truncated) ... text
- collapsed: Show summary vs full trace tree (default: false)
- include_hierarchy: Include tree structure for traces (default: true)
- max_depth: Maximum depth for hierarchical rendering (default: unlimited)
- tools_collapse_threshold: Number of tools before auto-collapsing list (default: 5)
  - Tool lists >5 items show a <<<TOOLS_EXPANDABLE|...>>> marker for the frontend
  - Or [+] AVAILABLE TOOLS: N for the backend when include_markers: false
- include_line_numbers: Prefix each line with a line number like L001:, L010: (default: false)
Use Cases:
- Frontend display: truncated: true, include_markers: true, include_line_numbers: true
- Backend LLM context (summary): truncated: true, include_markers: false, collapsed: true
- Backend LLM context (full): truncated: false
The response includes the formatted text and metadata about the rendering.
Required API key scopes
llm_analytics:write
Path parameters
- project_id: string
Request parameters
- event_type
- data
- options
Response
Example request
POST /api/environments/:project_id/llm_analytics/text_repr
Example response
Status 200
Status 400
Status 500
Status 503
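A sketch of a text_repr request using the "backend LLM context (summary)" preset from the use cases above. The exact shape of `data` (the raw event payload) is not documented in this section, so an empty dict stands in for it; host and IDs are placeholders:

```python
import json
import urllib.request

HOST, PROJECT_ID = "https://app.example.com", "123"  # placeholders

payload = {
    "event_type": "$ai_generation",
    "data": {},  # the event to render (placeholder; real shape not shown here)
    "options": {                 # "backend LLM context (summary)" preset
        "truncated": True,
        "include_markers": False,
        "collapsed": True,
    },
}

req = urllib.request.Request(
    f"{HOST}/api/environments/{PROJECT_ID}/llm_analytics/text_repr",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Authorization": "Bearer <personal-api-key>",  # llm_analytics:write scope
             "Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would send it; expect HTTP 200 with the text and metadata.
```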
Create environments llm analytics translate
Translate text to target language.
Required API key scopes
llm_analytics:write
Path parameters
- project_id: string
Example request
POST /api/environments/:project_id/llm_analytics/translate
Example response
Status 201 No response body
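This section lists no request parameters, so the sketch below shows only the call shape; the field names for the text and target language are not documented here, and all values are placeholders:

```python
import urllib.request

HOST, PROJECT_ID = "https://app.example.com", "123"  # placeholders

req = urllib.request.Request(
    f"{HOST}/api/environments/{PROJECT_ID}/llm_analytics/translate",
    data=b"{}",  # body fields (text, target language) aren't documented in this section
    headers={"Authorization": "Bearer <personal-api-key>",  # llm_analytics:write scope
             "Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would send it; expect HTTP 201 with no response body.
```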
List all environments llm prompts
Required API key scopes
llm_prompt:read
Path parameters
- project_id: string
Query parameters
- limit: integer
- offset: integer
Response
Example request
GET /api/environments/:project_id/llm_prompts
Example response
Status 200
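Listing is a GET with optional paging query parameters; a sketch with placeholder values:

```python
import urllib.parse
import urllib.request

HOST, PROJECT_ID = "https://app.example.com", "123"  # placeholders
query = urllib.parse.urlencode({"limit": 20, "offset": 0})  # optional paging

req = urllib.request.Request(
    f"{HOST}/api/environments/{PROJECT_ID}/llm_prompts?{query}",
    headers={"Authorization": "Bearer <personal-api-key>"},  # llm_prompt:read scope
    method="GET",
)
# urllib.request.urlopen(req) would fetch it; expect HTTP 200 with a page of prompts.
```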
Create environments llm prompts
Required API key scopes
llm_prompt:write
Path parameters
- project_id: string
Request parameters
- name: string
- prompt
- deleted: boolean
Response
Example request
POST /api/environments/:project_id/llm_prompts
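A sketch of the prompt-create call. The type of `prompt` is not specified in this section, so a plain string is assumed; host, ID, and content values are placeholders:

```python
import json
import urllib.request

HOST, PROJECT_ID = "https://app.example.com", "123"  # placeholders

payload = {
    "name": "greeting",             # illustrative
    "prompt": "<prompt content>",   # type not specified above; a string is assumed
}

req = urllib.request.Request(
    f"{HOST}/api/environments/{PROJECT_ID}/llm_prompts",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Authorization": "Bearer <personal-api-key>",  # llm_prompt:write scope
             "Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would create the prompt.
```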