Server Reference
Anton CoWork API.
A local FastAPI service that wraps Anton's Python core and exposes it as an OpenAI-compatible HTTP surface — plus the cowork sidecar (projects, attachments, artifacts, schedules, integrations) that powers the desktop app. The server runs alongside the renderer, scoped to loopback, with no external auth.
Overview
All resource endpoints live under /v1/*. Two unversioned endpoints sit at the root
for liveness checks. Every cowork-specific request carries an optional project field
that scopes work to a project folder; when omitted, the active project from
GET /v1/projects/active is used.
Override the host and port with environment variables before launching:
```shell
# start the server
ANTON_SERVER_HOST=127.0.0.1 ANTON_SERVER_PORT=26866 python server/main.py
```
CORS is locked to the renderer dev origin (http://localhost:5173),
the packaged-app origin (app://-), and an optional VITE_RENDERER_URL.
The desktop app talks to this server in-process — there is no external entry point.
Health
The health payload reports config_ready, provider, live_conversations, and live_pads, plus anton_available, which makes it useful as a connectivity probe.
Sample response:

```json
{
  "status": "ok",
  "anton_available": true,
  "mode": "anton",
  "config_ready": true,
  "provider": "anthropic",
  "model": "claude-sonnet-4-6",
  "provider_label": "Anthropic",
  "live_conversations": [],
  "live_pads": []
}
```
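A client can gate its UI on these fields. A minimal sketch in Python (the readiness rule combining anton_available and config_ready is an assumption for illustration, not a documented contract):

```python
def is_ready(health: dict) -> bool:
    """Treat the server as usable only when the Anton core is importable
    and a provider has been configured (rule assumed for illustration)."""
    return bool(health.get("anton_available")) and bool(health.get("config_ready"))

sample = {
    "status": "ok",
    "anton_available": True,
    "mode": "anton",
    "config_ready": True,
    "provider": "anthropic",
    "model": "claude-sonnet-4-6",
    "provider_label": "Anthropic",
    "live_conversations": [],
    "live_pads": [],
}
print(is_ready(sample))  # True
```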
Responses API
POST /v1/responses is the headline endpoint — a near-drop-in for OpenAI's Responses API
with cowork extensions for projects and attachments. Streaming is the default;
pass stream: false for one-shot JSON. The first event of a streamed response
carries the conversation_id the desktop app uses as its task id.
Request
```
POST /v1/responses
Content-Type: application/json

{
  "model": "anton",                 // optional; "anton" uses the configured default
  "input": "Summarize this quarter's roadmap",
  "stream": true,                   // SSE; false → ResponseObject JSON
  "conversation": "task-2025-Q3",   // existing id, or omit to mint one
  "project": "acme-engineering",    // optional; defaults to active project
  "attachment_ids": ["att_1a2b…"]   // uploaded via /v1/attachments/*
}
```
Streamed response
```
event: response.created
data: {"id":"resp-…", "conversation_id":"task-2025-Q3"}

event: response.in_progress
data: {"thought_role":"thought.scratchpad.start", "label":"Reading roadmap.md"}

event: response.output_text.delta
data: {"delta":"Q3 ships three things…"}

event: response.completed
data: {"id":"resp-…", "output_text":"Q3 ships three things…"}
```
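A stream like the one above can be consumed with a few lines of buffering. A minimal sketch of an SSE parser that pulls the conversation_id out of the first event (simplified: it assumes one event: line and one data: line per frame):

```python
import json

def parse_sse(raw: str):
    """Split a raw SSE body into (event, data_dict) pairs.
    Minimal parser: assumes one `event:` and one `data:` line per frame."""
    frames = []
    event, data = None, None
    for line in raw.splitlines():
        if line.startswith("event:"):
            event = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data = json.loads(line[len("data:"):].strip())
        elif line == "" and event is not None:
            frames.append((event, data))
            event, data = None, None
    if event is not None:  # flush a trailing frame with no blank line after it
        frames.append((event, data))
    return frames

stream = (
    'event: response.created\n'
    'data: {"id":"resp-1","conversation_id":"task-2025-Q3"}\n'
    '\n'
    'event: response.completed\n'
    'data: {"id":"resp-1","output_text":"done"}\n'
)
frames = parse_sse(stream)
print(frames[0][1]["conversation_id"])  # task-2025-Q3
```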
Streaming events
The five top-level SSE events follow the OpenAI Responses contract. Anton's tool activity
rides on response.in_progress frames, tagged by thought_role so clients
can render an inline progress trail without parsing free-form text.
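Filtering those frames is enough to build the trail. A hypothetical helper, assuming the (event, data) pairs produced by any SSE parser:

```python
def progress_trail(frames):
    """Collect (thought_role, label) pairs from response.in_progress frames
    so a client can render an inline activity trail."""
    trail = []
    for event, data in frames:
        if event == "response.in_progress" and data:
            trail.append((data.get("thought_role"), data.get("label")))
    return trail

frames = [
    ("response.created", {"id": "resp-1"}),
    ("response.in_progress",
     {"thought_role": "thought.scratchpad.start", "label": "Reading roadmap.md"}),
    ("response.output_text.delta", {"delta": "Q3 ships three things…"}),
]
print(progress_trail(frames))  # [('thought.scratchpad.start', 'Reading roadmap.md')]
```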
Text deltas arrive on response.output_text frames; failures carry a code and an error.
Thought roles
thought_role values follow a dotted naming scheme (for example thought.scratchpad.start); a result frame carries a preview, and an end frame includes the hits.
Conversations
Conversations are the persistent backbone of every task. They survive restarts, carry the full message history with attached thought events, and can be reassigned to other projects.
A conversation can be renamed or moved with an update body of { "title"?, "project"? }.
Projects
Projects are folders on disk. Each project owns its conversations, attachments, memory, and generated artifacts. Exactly one project is active at a time; new tasks default to it.
Create and activate take { "name": "…" }; rename takes { "name": "new-name" }.
Settings
Provider selection, default model, greeting, and other renderer-visible knobs.
Settings are written to ~/.anton/.env and reloaded by the running server.
Settings responses report config_ready / config_error.
Attachments
Attachments are files that augment a turn.
Upload first, then pass the returned ids in attachment_ids on
POST /v1/responses. Attachments are scoped to a project and attach to a
conversation as soon as one exists.
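The upload-then-reference flow can be sketched as a payload builder; the helper name is hypothetical and only the fields shown in the request example above are used:

```python
def responses_body(prompt, attachment_ids=None, project=None, conversation=None):
    """Assemble a POST /v1/responses body. Upload attachments first via
    /v1/attachments/*, then pass the returned ids here; omit project to
    fall back to the active project."""
    body = {"model": "anton", "input": prompt, "stream": True}
    if attachment_ids:
        body["attachment_ids"] = list(attachment_ids)
    if project:
        body["project"] = project
    if conversation:
        body["conversation"] = conversation
    return body

body = responses_body(
    "Summarize this quarter's roadmap",
    attachment_ids=["att_1a2b"],
    project="acme-engineering",
)
```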
Attachments are scoped by project or session_id; each one's status is ready, processing, or failed.
Artifacts
Artifacts are files Anton generates during a task — reports, charts, generated code, dashboards.
They live in the project's .anton/artifacts directory and can be previewed
in the renderer or revealed in the OS file manager.
Scratchpad
Anton's sandboxed Python runtime, exposed for direct use by the desktop app and external clients. Each named scratchpad is a long-lived kernel with its own venv. Cells stream stdout / stderr / display data over SSE.
Memory
Long-term memory is per-project. Anton calls /memory internally during recall
and memorize phases; you can also seed or audit it directly.
Memory entries take { name, description, type, content, project? }.
Skills
Datasources
Direct access to the data vault — credentials for connectors Anton can use during scratchpad runs. Validation calls the engine's connection probe without persisting.
These endpoints back the legacy Connect Apps and Data form. The chat-driven
credential workflow lives at /v1/datavault/submissions
and is described end-to-end in the Data Vault docs.
Datasource bodies take { engine, name, params }.
Datavault forms
The chat-driven credential collection workflow. Anton emits a form via the
request_credentials tool; the renderer renders it; the user submits to
this endpoint, which streams a server-side agent's verdict back as SSE — same shape
as /v1/responses, so the existing client adapter consumes it natively.
The full architecture (agent, headless probe, patch dialect, UI) is on its own page — see Data Vault docs.
Form state changes arrive as data-vault-form-patch blocks. The submission body is { form_id, conversation_id?, values, skipped, form_spec? }. Submissions are persisted so the fetch_submission tool can read them.
Submit request
```
POST /v1/datavault/submissions
Content-Type: application/json

{
  "form_id": "fm_a3f9c2b41e",
  "conversation_id": "task-2025-Q3",
  "values": { "api_key": "phx_…" },
  "skipped": [],
  "form_spec": { /* full spec the renderer holds, with auth_method spread in for multi-method */ }
}
```
Streamed response
```
event: response.created
data: {"id":"resp-…", "model":"datavault-agent"}

event: response.output_text.delta
data: {"delta":"Trying to connect to **posthog**…"}

event: response.output_text.delta
data: {"delta":"\n\n```data-vault-form-patch\n{...}\n```\n\n"}

event: response.in_progress
data: {"thought_role":"thought.scratchpad.start"}

event: response.completed
data: {"id":"resp-…", "response":{"status":"success"}}
```
response.completed.response.status is one of success,
retry, needs_input, or failed. The renderer doesn't
have to act on it — every meaningful state change rides on a
data-vault-form-patch block that flips the form to its appropriate UI.
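A client that does want to branch on the final verdict could do so with a small helper (the helper name and the terminal/non-terminal split are assumptions; the four statuses come from the list above):

```python
TERMINAL = {"success", "failed"}

def submission_verdict(completed_data: dict):
    """Read response.completed.response.status and report whether the form
    flow is finished (success/failed) or expects another round
    (retry/needs_input)."""
    status = completed_data.get("response", {}).get("status", "failed")
    return status, status in TERMINAL

print(submission_verdict({"id": "resp-1", "response": {"status": "needs_input"}}))
# ('needs_input', False)
```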
Publish
Publishes a task to 4nton.ai; returns the share URL.
Pins
Pinned tasks — the sidebar's "Pinned" rail. Pins are reorderable and capture a recent-visit hint so the renderer can surface "where you left off."
Pin bodies take { kind, id, title }; reorder takes { ids: [...] }. Tasks are pinned automatically when autoPin is enabled.
Schedules
Cron-style automation. Each schedule fires a prompt against a project + model on its cadence; the resulting conversation is saved like any other task.
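A client might sanity-check the cron field before POSTing a schedule. A rough sketch (the validation rules are assumed; the server remains authoritative, and ranges, steps, and lists are deliberately not handled):

```python
def valid_cron(expr: str) -> bool:
    """Cheap client-side check for a 5-field cron expression
    (minute hour day-of-month month day-of-week): each field must be
    '*' or a bare integer within range. Illustrative only."""
    bounds = [(0, 59), (0, 23), (1, 31), (1, 12), (0, 6)]
    fields = expr.split()
    if len(fields) != len(bounds):
        return False
    for field, (lo, hi) in zip(fields, bounds):
        if field == "*":
            continue
        if not field.isdigit() or not lo <= int(field) <= hi:
            return False
    return True

print(valid_cron("0 9 * * 1"))  # True: every Monday at 09:00
```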
Schedule bodies take { cron, prompt, project?, model? }.
Search
Queries go through the q parameter.
Browse
Integrations
Third-party connectors. OAuth flows complete server-side; the renderer polls
GET /v1/integrations for the latest status.
Error shape
FastAPI's default HTTPException shape, with a stable status code and a
human-readable detail string. Streaming endpoints emit response.failed with
the same status mapped into the code field.
```json
{ "detail": "Anton is not installed in this desktop environment." }
```
- 400 — invalid request body or missing required field.
- 404 — project, conversation, schedule, or pin not found.
- 409 — name collision (project create, schedule create).
- 503 — Anton is not installed in this environment, or the configured provider is unreachable.
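A client-side handler for this shape might look like the following sketch (hint strings are paraphrased from the status list above; the function name is hypothetical):

```python
def explain_error(status: int, payload: dict) -> str:
    """Turn the FastAPI error shape into a user-facing message,
    mapping the documented status codes to short hints."""
    hints = {
        400: "invalid request body or missing required field",
        404: "resource not found",
        409: "name collision",
        503: "Anton unavailable or provider unreachable",
    }
    detail = payload.get("detail", "unknown error")
    return f"{status} ({hints.get(status, 'error')}): {detail}"

print(explain_error(503, {"detail": "Anton is not installed in this desktop environment."}))
```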