
Supported providers

We support multiple LLM providers under a unified interface. Besides the providers listed in the table below, any provider that exposes an OpenAI-compatible API can be used by specifying the `baseUrl` option in the `OpenAIChatModel` configuration.
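
For example, an endpoint that speaks the OpenAI API can be targeted by overriding `baseUrl`. The sketch below assumes only what is stated above (an `OpenAIChatModel` with a `baseUrl` option); the import path and the `apiKey` and `model` option names are illustrative placeholders, so check the provider's and this library's documentation for the exact values.

```ts
// A minimal sketch of pointing OpenAIChatModel at an OpenAI-compatible provider.
// Only the baseUrl option is documented above; the other fields are assumptions.
import { OpenAIChatModel } from "my-llm-library"; // placeholder package name

const model = new OpenAIChatModel({
  baseUrl: "https://api.example-provider.com/v1", // the provider's OpenAI-compatible endpoint
  apiKey: process.env.PROVIDER_API_KEY,           // assumed option name
  model: "example-model-name",                    // assumed option name
});
```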

| Provider | Sampling Params | Function Calling | Structured Output | Text Input | Image Input | Audio Input | Citation¹ | Text Output | Image Output | Audio Output | Reasoning |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| OpenAI (Responses) | ✅ except top_k, frequency_penalty, presence_penalty, seed |  |  |  |  |  |  |  |  |  |  |
| OpenAI (Chat Completion) | ✅ except top_k |  |  |  |  |  |  |  |  |  |  |
| Anthropic | ✅ except frequency_penalty, presence_penalty, seed |  |  |  |  |  |  |  |  |  |  |
| Google |  |  |  |  |  |  |  |  |  |  |  |
| Cohere |  |  |  |  |  |  |  |  |  |  |  |
| Mistral | ✅ except top_k |  |  |  |  |  |  |  |  |  |  |

Key:

  • ✅: Supported
  • 🚧: Not yet implemented
  • ➖: Not available from provider
  ¹ Source Input (citation) is not supported by all providers and may be converted to a compatible input instead.