Behind every message in a thread, Coral Console can show additional context about how the agent reached its conclusion. You can inspect this in the Telemetry dialog.
Setup
If enabled in your Console build, open any existing thread, select the ellipsis on the top‑right of a message, and click “View Telemetry”.

Note: The Telemetry dialog may be feature‑gated in some builds. If you don’t see “View Telemetry” in the message menu, update to the latest Console or enable the feature in your build.
Why use Telemetry
The insights available for a message depend on the underlying provider/format used by the agent (e.g., OpenAI vs. Anthropic). Payloads can be quite verbose; the Telemetry dialog organizes them into readable sections. Data inside these payloads can include:
- Agent tooling usage
- Tooling responses
- Errors
- Thread complexity
- Hyperparameters
- Temperature
- Max tokens
- Preamble (system prompt)
- Resources (documents provided to the agent)
- Tool descriptions
- Agent capabilities
Tabs in the Telemetry dialog
- Message Events: Chronological view of message parts (user/assistant/tool/developer) and tool calls (JSON arguments/results).
- Thread Details: model_description, preamble/system prompt, and resources in scope.
- Hyperparameters: Temperature, max_tokens, and additional parameters from the provider.
Server telemetry and exporters
Coral Server supports attaching structured telemetry to individual messages. The primary format is an OpenAI‑compatible representation of chat events (system/user/assistant/tool), mirroring the Chat Completions schema.
- OpenAI‑compatible payloads: Messages are represented as system, user, assistant, and tool entries with content blocks. This mirrors the OpenAI Chat API structure so you can export existing logs with minimal transformation.
- Message‑scoped storage: Telemetry is associated with a specific message inside a thread, which is why the Console can open telemetry directly from a message menu.
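As a rough sketch, an OpenAI‑compatible telemetry payload could look like the following. The role/content structure and tool‑call fields mirror the Chat Completions schema; the surrounding envelope (e.g., the `messages` key) is an illustrative assumption here, and the authoritative schema lives in Coral Server’s Telemetry.kt.

```python
import json

# Sketch of an OpenAI-compatible telemetry payload for one message's
# history: a system preamble, the user turn, a tool call with its
# result, and the final assistant answer.
telemetry_payload = {
    "messages": [
        {"role": "system", "content": "You are a helpful agent."},
        {"role": "user", "content": "What is the weather in Lisbon?"},
        {
            "role": "assistant",
            "content": None,
            "tool_calls": [
                {
                    "id": "call_1",
                    "type": "function",
                    "function": {
                        "name": "get_weather",
                        "arguments": json.dumps({"city": "Lisbon"}),
                    },
                }
            ],
        },
        # Tool results reference the originating call via tool_call_id.
        {"role": "tool", "tool_call_id": "call_1", "content": "{\"temp_c\": 21}"},
        {"role": "assistant", "content": "It is 21 °C in Lisbon."},
    ]
}

print(json.dumps(telemetry_payload, indent=2))
```

Because the structure matches existing Chat Completions logs, exporting from an OpenAI‑based runtime is mostly a matter of forwarding the message list as‑is.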
Posting telemetry via API
If you want to enrich messages with telemetry from your own runtimes or tools, use the Telemetry API:
- POST /api/v1/telemetry/{sessionId} — attach telemetry to a message
- GET /api/v1/telemetry/{sessionId}/{threadId}/{messageId} — fetch telemetry for a message
- The Console’s Telemetry dialog consumes the same stored payloads. You can post additional events (e.g., tool call args/results) as the message progresses.
- Exact model classes live under Telemetry.kt in Coral Server; see the project wiki for the latest schema overview.
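A minimal client sketch for these endpoints, using only the Python standard library. The URL paths come from the API above; the base address and the request body shape (thread/message identifiers plus an OpenAI‑style message list) are assumptions for illustration — check Telemetry.kt for the authoritative schema.

```python
import json
import urllib.request

BASE_URL = "http://localhost:5555"  # assumed Coral Server address


def telemetry_post_url(session_id: str) -> str:
    # POST /api/v1/telemetry/{sessionId} — attach telemetry to a message
    return f"{BASE_URL}/api/v1/telemetry/{session_id}"


def telemetry_get_url(session_id: str, thread_id: str, message_id: str) -> str:
    # GET /api/v1/telemetry/{sessionId}/{threadId}/{messageId} — fetch telemetry
    return f"{BASE_URL}/api/v1/telemetry/{session_id}/{thread_id}/{message_id}"


def build_post_request(session_id: str, payload: dict) -> urllib.request.Request:
    # Builds (but does not send) the POST request. The payload keys below
    # are an illustrative assumption, not a confirmed Coral Server schema.
    return urllib.request.Request(
        telemetry_post_url(session_id),
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_post_request(
    "session-123",
    {
        "threadId": "thread-456",
        "messageId": "msg-789",
        "messages": [{"role": "user", "content": "What is the weather in Lisbon?"}],
    },
)
# Against a live server: urllib.request.urlopen(req)
```

Since telemetry is message‑scoped, you can call the POST endpoint repeatedly as a message progresses — for example, once when tool‑call arguments are known and again when results arrive.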