Quickstart
First request in under 2 minutes. If you have used the OpenAI SDK before, you already know how this works.
Get an API key
Contact us to receive your API key. Once provisioned, your key grants immediate access to all available models on the Continuum Inference platform.
Install the SDK
Continuum uses the standard OpenAI SDK. No proprietary library required. Install for your language:
Using cURL or raw HTTP instead? No installation required: any HTTP client works.
Make your first request
Send a chat completion request. The API is identical to OpenAI — same request format, same response format, same SDK methods.
Examples are available in Python, TypeScript, and cURL.
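Whichever client you use, the request is a POST of a standard OpenAI-format JSON body to the chat completions endpoint. The sketch below builds that body with only the standard library; the base URL and model name are placeholder assumptions, not real Continuum values — substitute the ones provided with your key.

```python
import json

# Placeholders -- replace with your real Continuum endpoint, key, and model.
BASE_URL = "https://api.continuum.example/v1"
API_KEY = "YOUR_API_KEY"

# The body is identical to an OpenAI chat completion request.
payload = {
    "model": "continuum-model",  # placeholder model name
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Say hello in one sentence."},
    ],
}

# POST this to f"{BASE_URL}/chat/completions" with an
# "Authorization: Bearer {API_KEY}" header, using any HTTP client.
body = json.dumps(payload)
print(body)
```

With the OpenAI SDK, the same request is a single `client.chat.completions.create(...)` call; the SDK serialises an identical body for you.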
Migrate from OpenAI
If you have an existing OpenAI integration, migration is two lines. Your prompts, tools, streaming, and response handling stay identical.
Migrate from Anthropic
If you are using the Anthropic Python SDK, switch to the OpenAI SDK with Continuum as the base URL. The message format carries over unchanged (system, user, and assistant roles), and tool definitions must use the OpenAI format.
Note: The Anthropic SDK uses client.messages.create() while the OpenAI SDK uses client.chat.completions.create(). The message format (roles and content) is compatible. Tool definitions differ slightly — Anthropic uses input_schema while OpenAI uses parameters. See the tool calling guide for details.
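The tool-definition difference amounts to renaming input_schema to parameters and wrapping it in OpenAI's function envelope. A sketch with a hypothetical get_weather tool (the tool itself is illustrative, not part of either API):

```python
# A hypothetical tool in Anthropic's format, which uses "input_schema".
anthropic_tool = {
    "name": "get_weather",
    "description": "Get the current weather for a city.",
    "input_schema": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

def to_openai_tool(tool: dict) -> dict:
    """Rename input_schema -> parameters and wrap in OpenAI's function envelope."""
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool["description"],
            "parameters": tool["input_schema"],
        },
    }

openai_tool = to_openai_tool(anthropic_tool)
print(openai_tool["function"]["parameters"]["required"])  # -> ['city']
```

The JSON Schema inside the definition is unchanged; only the surrounding envelope differs between the two formats.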
Response format
Responses follow the standard OpenAI chat completion format. If your code already parses OpenAI responses, it works with Continuum without changes.
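For concreteness, here is a hypothetical response in the standard OpenAI chat completion shape and the fields you would typically read from it (the id and token counts are made up for illustration):

```python
import json

# A hypothetical response body in the standard OpenAI chat completion shape.
raw = """
{
  "id": "chatcmpl-123",
  "object": "chat.completion",
  "choices": [
    {
      "index": 0,
      "message": {"role": "assistant", "content": "Hello there!"},
      "finish_reason": "stop"
    }
  ],
  "usage": {"prompt_tokens": 12, "completion_tokens": 4, "total_tokens": 16}
}
"""
response = json.loads(raw)

text = response["choices"][0]["message"]["content"]
reason = response["choices"][0]["finish_reason"]
billed = response["usage"]["total_tokens"]
print(text, reason, billed)  # -> Hello there! stop 16
```

The same paths work on SDK response objects via attribute access, e.g. response.choices[0].message.content.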
Key fields
- choices[0].message.content: the model's response text.
- choices[0].message.reasoning_content: the reasoning chain (present only when thinking mode is enabled).
- choices[0].message.tool_calls: tool call requests (present only when the model decides to call a tool).
- choices[0].finish_reason: "stop" (natural end), "length" (hit max_tokens), or "tool_calls" (the model wants to call a tool).
- usage.total_tokens: total tokens consumed (prompt + completion). This is what you are billed on.
Next steps
API Reference
Full parameter documentation for the chat completions endpoint.
Tool Calling
Use function calling to connect the model to external tools and data.
Thinking Modes
Enable extended reasoning for complex analytical tasks.
JSON Output
Get guaranteed valid JSON responses for structured extraction.
Models & Benchmarks
Detailed benchmarks and capability comparison against Claude and GPT.
Pricing
Token pricing, savings calculator, and plan comparison.
Need help integrating?
Our team can help you migrate from Anthropic or OpenAI and optimise your deployment for cost and performance.