
Behind every message in a thread, Coral Console can show additional context about how the agent reached its conclusion. You can inspect this in the Telemetry dialog.

Setup

Open any existing thread, select the ellipsis at the top right of a message, and click “View Telemetry”.
Note: The Telemetry dialog may be feature‑gated in some builds. If you don’t see “View Telemetry” in the message menu, update to the latest Console or enable the feature in your build.

Why use Telemetry

The insights available for a message depend on the underlying provider/format used by the agent (e.g., OpenAI vs. Anthropic). Payloads can be quite verbose; the Telemetry dialog organizes them into readable sections. Data inside these payloads can include:
  • Agent tooling usage
  • Tooling responses
  • Errors
  • Thread complexity
  • Hyperparameters
Thread‑level details are also available:
  • Temperature
  • Max tokens
  • Preamble (system prompt)
  • Resources (documents provided to the agent)
  • Tool descriptions
  • Agent capabilities

Tabs in the Telemetry dialog

  • Message Events: Chronological view of message parts (user/assistant/tool/developer) and tool calls (JSON arguments/results).
  • Thread Details: model_description, preamble/system prompt, and resources in scope.
  • Hyperparameters: Temperature, max_tokens, and additional parameters from the provider.

Server telemetry and exporters

Coral Server supports attaching structured telemetry to individual messages. The primary format is an OpenAI‑compatible representation of chat events (system/user/assistant/tool), mirroring the Chat Completions schema.
  • OpenAI‑compatible payloads: Messages are represented as system, user, assistant, and tool entries with content blocks. This mirrors the OpenAI Chat API structure so you can export existing logs with minimal transformation.
  • Message‑scoped storage: Telemetry is associated with a specific message inside a thread, which is why the Console can open telemetry directly from a message menu.
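As an illustration, here is a sketch (in Python) of how a tool call and its result might be represented in an OpenAI‑compatible telemetry payload. Field names are assumed from the OpenAI Chat Completions schema (`tool_calls`, `tool_call_id`, function name/arguments); consult Telemetry.kt in Coral Server for the authoritative model classes.

```python
import json

# Hypothetical OpenAI-compatible event list: an assistant message that
# invokes a tool, followed by the tool's result. Field names mirror the
# OpenAI Chat Completions schema; verify against Telemetry.kt.
messages = [
    {"role": "user", "content": [{"type": "text", "text": "What is 2 + 2?"}]},
    {
        "role": "assistant",
        "content": [],
        "tool_calls": [
            {
                "id": "call_1",
                "type": "function",
                # Arguments are conventionally a JSON-encoded string
                "function": {"name": "calculator",
                             "arguments": json.dumps({"expr": "2 + 2"})},
            }
        ],
    },
    {"role": "tool", "tool_call_id": "call_1",
     "content": [{"type": "text", "text": "4"}]},
]

payload = {"type": "openai", "messages": messages}
print(json.dumps(payload, indent=2))
```

Posting a payload like this for a message lets the Console's Message Events tab render the tool call's JSON arguments and result alongside the surrounding conversation.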

Posting telemetry via API

If you want to enrich messages with telemetry from your own runtimes or tools, use the Telemetry API:
  • POST /api/v1/telemetry/{sessionId} — attach telemetry to a message
  • GET /api/v1/telemetry/{sessionId}/{threadId}/{messageId} — fetch telemetry for a message
Minimal example (attach telemetry):
{
  "targets": {
    "threadId": "thread-abc123",
    "messageId": "msg-001"
  },
  "data": {
    "type": "openai",
    "messages": [
      { "role": "system", "content": [{ "type": "text", "text": "You are a helpful agent." }] },
      { "role": "user", "content": [{ "type": "text", "text": "Summarize quarterly revenue." }] },
      { "role": "assistant", "content": [{ "type": "text", "text": "Working on it…" }] }
    ]
  }
}
Notes:
  • The Console’s Telemetry dialog consumes the same stored payloads. You can post additional events (e.g., tool call args/results) as the message progresses.
  • Exact model classes live under Telemetry.kt in Coral Server; see the project wiki for the latest schema overview.
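The endpoints above can be exercised from any HTTP client. The following Python sketch builds the POST and GET requests using only the standard library; the base URL (`http://localhost:5555`), the session/thread/message IDs, and the absence of auth headers are assumptions for illustration — adjust them for your deployment.

```python
import json
import urllib.request

BASE_URL = "http://localhost:5555"  # assumption: your Coral Server address


def build_attach_request(session_id: str, thread_id: str,
                         message_id: str, data: dict) -> urllib.request.Request:
    """Build a POST request for /api/v1/telemetry/{sessionId},
    using the payload shape shown in the minimal example above."""
    body = json.dumps({
        "targets": {"threadId": thread_id, "messageId": message_id},
        "data": data,
    }).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/api/v1/telemetry/{session_id}",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def build_fetch_request(session_id: str, thread_id: str,
                        message_id: str) -> urllib.request.Request:
    """Build a GET request for
    /api/v1/telemetry/{sessionId}/{threadId}/{messageId}."""
    return urllib.request.Request(
        f"{BASE_URL}/api/v1/telemetry/{session_id}/{thread_id}/{message_id}",
        method="GET",
    )


# To actually send a request: urllib.request.urlopen(req)
# (omitted here, since it needs a running Coral Server).
req = build_attach_request("sess-1", "thread-abc123", "msg-001",
                           {"type": "openai", "messages": []})
```

Because telemetry is message‑scoped, you can call the POST endpoint repeatedly for the same message as new events (tool calls, results, errors) arrive.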

Enabling and configuration

Telemetry capture is available by default and tied to the message lifecycle. The Console UI may be feature‑gated; if the menu entry is hidden, enable Telemetry in your build or update to a recent Console version. For server‑side diagnostics and logs, see the Logs & Diagnostics section of the Troubleshooting page.

Example

Here’s a video showing the menu when inspecting a thread about a brief search on cats.