
# Providers Overview

multi-llm-ts supports multiple LLM providers with a consistent API. This page provides an overview of supported providers and their capabilities.

## Supported Providers

Each provider is tracked across the same set of capabilities: completion, vision, function calling, reasoning, structured output, usage reporting, and computer use, each described below.

| Provider | ID |
| --- | --- |
| Anthropic | `anthropic` |
| Azure AI | `azure` |
| Cerebras | `cerebras` |
| DeepSeek | `deepseek` |
| Google¹ | `google` |
| Groq | `groq` |
| Meta/Llama | `meta` |
| MistralAI¹ | `mistralai` |
| Ollama | `ollama` |
| OpenAI² | `openai` |
| OpenRouter | `openrouter` |
| TogetherAI² ³ | `openai` |
| xAI | `xai` |

Notes:

1. The provider supports JSON output but does not enforce a specific schema; you need to describe the expected schema in the user message.
2. Some capabilities are not supported for o1 family models.
3. TogetherAI uses the `openai` provider ID. Set `baseURL` to `https://api.together.xyz/v1`, as in the sketch below.
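
For example, a TogetherAI setup reuses the `openai` provider ID and points `baseURL` at Together's endpoint. This is a minimal sketch using the configuration shape from the Configuration section below; the environment variable name is illustrative.

```typescript
import { igniteModel, loadModels } from 'multi-llm-ts'

// TogetherAI goes through the OpenAI-compatible provider, pointed at Together's endpoint.
const togetherConfig = {
  apiKey: process.env.TOGETHER_API_KEY ?? '', // illustrative environment variable name
  baseURL: 'https://api.together.xyz/v1',
}

const models = await loadModels('openai', togetherConfig)
const model = igniteModel('openai', models.chat[0], togetherConfig)
```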

## Feature Descriptions

### Completion & Streaming

All providers support both synchronous completion and streaming responses.
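
As a rough sketch, assuming the model object returned by `igniteModel` exposes `complete()` for one-shot responses and `generate()` for an async-iterable stream, and that messages are built with the library's `Message` class (the exact method names may differ in your version):

```typescript
import { igniteModel, loadModels, Message } from 'multi-llm-ts'

const config = { apiKey: 'YOUR_API_KEY' }
const models = await loadModels('openai', config)
const model = igniteModel('openai', models.chat[0], config)

const messages = [
  new Message('system', 'You are a helpful assistant.'),
  new Message('user', 'Explain streaming responses in one paragraph.'),
]

// One-shot completion: wait for the full response.
// complete() is an assumption here; check your version's API.
const response = await model.complete(messages)
console.log(response)

// Streaming: consume chunks as they arrive.
// generate() returning an async iterable is likewise an assumption.
for await (const chunk of model.generate(messages)) {
  console.log(chunk)
}
```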

### Vision

Vision-capable models can analyze images provided as attachments. See Vision Guide for usage.
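
A hypothetical sketch of sending an image: the `Attachment` class, its constructor, and `message.attach()` are assumptions here, so treat the Vision Guide as authoritative.

```typescript
import { readFileSync } from 'node:fs'
import { igniteModel, loadModels, Message, Attachment } from 'multi-llm-ts'

const config = { apiKey: 'YOUR_API_KEY' }
const models = await loadModels('anthropic', config)
const model = igniteModel('anthropic', models.chat[0], config)

// Hypothetical: wrap base64 image data plus a MIME type in an Attachment
// and attach it to the user message. The real attachment API may differ.
const image = new Attachment(readFileSync('photo.jpg').toString('base64'), 'image/jpeg')
const message = new Message('user', 'What is in this picture?')
message.attach(image)

const response = await model.complete([message]) // complete() assumed, as above
console.log(response)
```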

### Function Calling

Function calling allows models to invoke tools/plugins. See Function Calling Guide for usage.
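
As an illustration, a tool is typically declared as a plugin class. The `Plugin` base class and the method names below are assumptions based on common multi-llm-ts usage; the Function Calling Guide documents the real contract, including how to register the plugin.

```typescript
import { Plugin } from 'multi-llm-ts'

// Hypothetical weather tool. The getName/getDescription/getParameters/execute
// shape is an assumption; registering the plugin with the model or engine is
// covered in the Function Calling Guide.
class WeatherPlugin extends Plugin {
  getName(): string {
    return 'get_weather'
  }
  getDescription(): string {
    return 'Returns the current weather for a given city'
  }
  getParameters() {
    return [
      { name: 'city', type: 'string', description: 'City name', required: true },
    ]
  }
  async execute(parameters: any): Promise<any> {
    // Replace with a real weather lookup.
    return { city: parameters.city, temperature: 21, unit: 'celsius' }
  }
}
```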

### Reasoning

Reasoning models, such as OpenAI's o1 family, work through a chain of thought before answering complex problems.

### Structured Output

Generate JSON responses validated against a schema using Zod. See Structured Output Guide.
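
The Zod side looks like the sketch below; how the schema is passed to the completion call is version-specific, and the `structuredOutput` option name shown in the comment is an assumption, so follow the Structured Output Guide for the actual wiring.

```typescript
import { z } from 'zod'

// A Zod schema describing the JSON we want back.
const Recipe = z.object({
  name: z.string(),
  servings: z.number().int().positive(),
  ingredients: z.array(z.string()),
})

// Hypothetical wiring: the option name is an assumption.
// const response = await model.complete(messages, {
//   structuredOutput: { name: 'recipe', structure: Recipe },
// })
```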

### Usage Reporting

Tracks token usage (prompt, completion, and total tokens) for cost estimation.
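
The reported counts can feed a simple cost estimate. The field names and prices below are illustrative, not part of the library's API:

```typescript
// Illustrative shape only: the actual usage object returned by the library may differ.
interface TokenUsage {
  promptTokens: number
  completionTokens: number
  totalTokens: number
}

// Estimate cost in USD from token counts (per-million-token prices are placeholders).
function estimateCostUSD(usage: TokenUsage, inputPerM = 3, outputPerM = 15): number {
  return (usage.promptTokens / 1_000_000) * inputPerM
       + (usage.completionTokens / 1_000_000) * outputPerM
}
```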

### Computer Use

Experimental feature allowing models to interact with computer interfaces (currently Anthropic and Google only).

## Configuration

Basic configuration for any provider:

```typescript
import { igniteModel, loadModels } from 'multi-llm-ts'

const config = {
  apiKey: 'YOUR_API_KEY',            // Required for cloud providers
  baseURL: 'https://api.custom.com', // Optional custom endpoint
  timeout: 30000,                    // Optional request timeout (ms)
}

const models = await loadModels('PROVIDER_ID', config)
const model = igniteModel('PROVIDER_ID', models.chat[0], config)
```
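
For instance, the same pattern works for a local Ollama instance; no API key is needed, and the `baseURL` below is Ollama's conventional local address (whether the library falls back to it automatically when `baseURL` is omitted is an assumption to verify):

```typescript
import { igniteModel, loadModels } from 'multi-llm-ts'

// Ollama runs locally, so no apiKey is required.
const ollamaConfig = { baseURL: 'http://127.0.0.1:11434' }

const ollamaModels = await loadModels('ollama', ollamaConfig)
const localModel = igniteModel('ollama', ollamaModels.chat[0], ollamaConfig)
```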
