Create message

POST /v1/messages

Example request:
const options = {
  method: 'POST',
  headers: {Authorization: 'Bearer <token>', 'Content-Type': 'application/json'},
  body: JSON.stringify({
    model: '<string>',
    messages: [{role: 'user', content: '<string>'}],
    max_tokens: 2,
    system: '<string>',
    temperature: 0.5,
    top_p: 0.5,
    top_k: 2,
    tools: [{name: '<string>', description: '<string>', input_schema: {}}],
    tool_choice: {},
    stream: true,
    metadata: {},
    reasoning: {effort: 'medium', summary: 'auto', enabled: true, max_tokens: 1},
    stop_sequences: ['<string>'],
    provider_options: {
      openai: {
        context_management: {type: 'compaction', compact_threshold: 123},
        prompt_cache_retention: '<string>'
      },
      anthropic: {cache_control: {type: '<string>', ttl: '<string>', scope: '<string>'}},
      google: {
        cache_control: {type: '<string>', ttl: '<string>', scope: '<string>'},
        cached_content: '<string>',
        cache_ttl: '<string>'
      }
    },
    usage: true,
    meta: true,
    echo_upstream_request: true,
    debug: {
      enabled: true,
      return_upstream_request: true,
      return_upstream_response: true,
      trace: true,
      trace_level: 'summary'
    },
    provider: {
      order: ['<string>'],
      only: ['<string>'],
      ignore: ['<string>'],
      include_alpha: true
    }
  })
};

fetch('https://api.phaseo.app/v1/messages', options)
  .then(res => res.json())
  .then(res => console.log(res))
  .catch(err => console.error(err));

Example response:

{
  "id": "<string>",
  "type": "<string>",
  "role": "assistant",
  "model": "<string>",
  "content": [
    {
      "type": "text",
      "text": "<string>",
      "cache_control": {
        "type": "<string>",
        "ttl": "<string>",
        "scope": "<string>"
      },
      "source": {
        "type": "<string>",
        "media_type": "<string>",
        "data": "<string>",
        "url": "<string>"
      },
      "id": "<string>",
      "name": "<string>",
      "input": {},
      "tool_use_id": "<string>",
      "content": "<string>"
    }
  ],
  "stop_reason": "<string>",
  "stop_sequence": "<string>",
  "usage": {
    "input_tokens": 123,
    "output_tokens": 123
  }
}
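The content field in the response above is an array of typed blocks. A minimal helper to pull the text out of a non-streaming response (an illustration, not part of any gateway SDK):

```javascript
// Concatenate the text of all `text` blocks in an Anthropic-style
// message response. Other block types (e.g. tool_use) are skipped.
function extractText(message) {
  return (message.content || [])
    .filter(block => block.type === 'text')
    .map(block => block.text)
    .join('');
}

// Example with a minimal response shape:
const message = {
  role: 'assistant',
  content: [
    {type: 'text', text: 'Hello, '},
    {type: 'tool_use', id: 'tu_1', name: 'search', input: {}},
    {type: 'text', text: 'world.'}
  ]
};
console.log(extractText(message)); // "Hello, world."
```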
Use /v1/messages to send Anthropic Messages API payloads through the gateway. The gateway normalizes your request into its internal intermediate representation (IR), routes it to a compatible provider surface, and emits Anthropic-formatted responses.

Endpoint

POST /v1/messages

Streaming

Set stream: true to receive server-sent events in Anthropic format (message_start, content_block_*, message_delta, message_stop).
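Each streamed frame follows the standard SSE convention: an `event:` line, a `data:` line carrying a JSON payload, and a blank-line separator. A hedged sketch of a parser for that wire format (the event names come from the list above; the frame layout is generic SSE, not gateway-specific):

```javascript
// Parse a chunk of SSE text into {event, data} objects.
// Each frame is an `event:` line, a `data:` line with a JSON payload,
// and a blank-line separator.
function parseSSE(chunk) {
  const events = [];
  for (const frame of chunk.split('\n\n')) {
    let event = null, data = null;
    for (const line of frame.split('\n')) {
      if (line.startsWith('event:')) event = line.slice(6).trim();
      else if (line.startsWith('data:')) data = JSON.parse(line.slice(5).trim());
    }
    if (event) events.push({event, data});
  }
  return events;
}

// Example with two Anthropic-style frames:
const chunk =
  'event: message_start\n' +
  'data: {"type":"message_start"}\n\n' +
  'event: content_block_delta\n' +
  'data: {"type":"content_block_delta","delta":{"type":"text_delta","text":"Hi"}}\n\n';
console.log(parseSSE(chunk).map(e => e.event)); // ["message_start", "content_block_delta"]
```

In a real client you would feed decoded chunks from the response body into a parser like this, buffering any partial frame at the end of each chunk.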

Notes

  • The gateway honors the Anthropic request shape and supports tool use fields.
  • Under current request validation, stream: true combined with tools is rejected with 400 invalid_request. Use non-streaming tool loops for now.
  • The X-AIStats-Strictness header controls how unsupported parameters are handled.
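Since streaming with tools is rejected, a tool loop has to run non-streaming: call the model, execute any tool_use blocks, feed tool_result blocks back, and repeat until the stop reason is no longer tool_use. A sketch under those assumptions; `send` stands in for a non-streaming POST to /v1/messages and `runTool` for your own tool dispatcher (both are hypothetical caller-supplied functions):

```javascript
// Run a non-streaming tool loop against an Anthropic-shaped API.
// `send(messages)` must return a message response; `runTool(name, input)`
// must return a string result. Both are supplied by the caller.
async function toolLoop(send, runTool, messages) {
  for (;;) {
    const response = await send(messages);
    if (response.stop_reason !== 'tool_use') return response;
    // Echo the assistant turn, then answer each tool_use block.
    messages.push({role: 'assistant', content: response.content});
    const results = [];
    for (const block of response.content) {
      if (block.type !== 'tool_use') continue;
      results.push({
        type: 'tool_result',
        tool_use_id: block.id,
        content: await runTool(block.name, block.input)
      });
    }
    messages.push({role: 'user', content: results});
  }
}
```

Each iteration appends one assistant turn and one user turn of tool results, so the conversation stays well-formed for the next call.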

Authorizations

  • Authorization — string, header, required. Bearer token authentication.
Body

application/json

  • model — string, required.
  • messages — object[], required. Minimum array length: 1.
  • max_tokens — integer, required. Range: x >= 1.
  • system — string.
  • temperature — number. Range: 0 <= x <= 1.
  • top_p — number. Range: 0 <= x <= 1.
  • top_k — integer. Range: x >= 1.
  • tools — object[].
  • tool_choice — object.
  • stream — boolean.
  • metadata — object.
  • reasoning — object.
  • stop_sequences — string[].
  • provider_options — object. Optional provider-specific options.
  • usage — boolean.
  • meta — boolean.
  • echo_upstream_request — boolean.
  • debug — object. Gateway debug controls. These flags are never forwarded upstream.
  • provider — object. Provider routing preferences for gateway selection.

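provider_options follows the nested shape shown in the request example at the top of the page. As an illustration of assembling one in client code (the field values here are placeholders for illustration, not validated defaults):

```javascript
// Build a provider_options object matching the request example's shape.
// Values are illustrative placeholders; consult each provider's section
// for the accepted enumerations.
const providerOptions = {
  anthropic: {
    cache_control: {type: 'ephemeral', ttl: '5m', scope: 'request'}
  },
  openai: {
    context_management: {type: 'compaction', compact_threshold: 123}
  }
};
console.log(Object.keys(providerOptions)); // ["anthropic", "openai"]
```

Only the keys for providers you actually target need to be present; the gateway treats the whole object as optional.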
Response

Message response:

  • id — string.
  • type — string.
  • role — enum<string>. Available options: assistant.
  • model — string.
  • content — object[].
  • stop_reason — string.
  • stop_sequence — string.
  • usage — object.
Last modified on February 18, 2026