LLM Provider Integration Tutorials¶
Learn how to integrate Instructor with various AI model providers. These comprehensive tutorials cover everything from cloud-based services like OpenAI and Anthropic to local open-source models, helping you extract structured outputs from any LLM.
- **Major Cloud Providers**: Leading AI providers with comprehensive features.
  OpenAI · OpenAI Responses · Azure · Anthropic · Google.GenerativeAI · Vertex AI · AWS Bedrock · Google.GenAI
- **Additional Cloud Providers**: Other commercial AI providers with specialized offerings.
  Cohere · Mistral · DeepSeek · Together AI · Groq · Fireworks · Cerebras · Writer · Perplexity · SambaNova
- **Open Source**: Run open-source models locally or in the cloud.
- **Routing**: Unified interfaces for multiple providers.
Common Features¶
All integrations support these core features:
| Feature | Description | Documentation |
|---|---|---|
| Model Patching | Enhance provider clients with structured output capabilities | Patching |
| Response Models | Define expected response schema with Pydantic | Models |
| Validation | Ensure responses match your schema definition | Validation |
| Streaming | Stream partial or iterative responses | Partial, Iterable |
| Hooks | Add callbacks for monitoring and debugging | Hooks |
However, each provider has different capabilities and limitations. Refer to the specific provider documentation for details.
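For example, streaming works through the `create_partial` method on the patched client. A minimal sketch, assuming an OpenAI API key is configured and using an example model string:

```python
import instructor
from pydantic import BaseModel


class UserInfo(BaseModel):
    name: str
    age: int


client = instructor.from_provider("openai/gpt-4o-mini")

# Stream partial objects: each iteration yields a UserInfo with
# whichever fields have been parsed from the response so far.
for partial in client.chat.completions.create_partial(
    response_model=UserInfo,
    messages=[{"role": "user", "content": "Jason is 25 years old"}],
):
    print(partial)
```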
Provider Modes¶
Providers support different methods for generating structured outputs:
| Mode | Description | Providers |
|---|---|---|
| TOOLS | Uses OpenAI-style tools/function calling | OpenAI, Anthropic, Mistral |
| PARALLEL_TOOLS | Multiple simultaneous tool calls | OpenAI |
| JSON | Direct JSON response generation | OpenAI, Gemini, Cohere, Perplexity |
| MD_JSON | JSON embedded in markdown | Most providers |
| BEDROCK_TOOLS | AWS Bedrock function calling | AWS Bedrock |
| BEDROCK_JSON | AWS Bedrock JSON generation | AWS Bedrock |
| PERPLEXITY_JSON | Perplexity JSON generation | Perplexity |
See the Modes Comparison guide for details.
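The mode is selected when the client is patched. A minimal sketch with the OpenAI SDK (the mode choice here is illustrative; pick one your provider supports):

```python
import instructor
from openai import OpenAI

# Patch the client with an explicit structured-output mode.
# TOOLS uses OpenAI-style function calling; providers without
# function calling typically use JSON or MD_JSON instead.
client = instructor.from_openai(OpenAI(), mode=instructor.Mode.TOOLS)
```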
Getting Started¶
There are two ways to use providers with Instructor:
1. Using Provider Initialization (Recommended)¶
The simplest way to get started is with the `from_provider` initializer:
```python
import instructor
from pydantic import BaseModel


class UserInfo(BaseModel):
    name: str
    age: int


# Initialize any provider with a simple string
client = instructor.from_provider("openai/gpt-4")

# Or use an async client
async_client = instructor.from_provider(
    "anthropic/claude-3-sonnet", async_client=True
)

# Use the same interface for all providers
response = client.chat.completions.create(
    response_model=UserInfo,
    messages=[{"role": "user", "content": "Your prompt"}],
)
```
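The async client exposes the same interface, awaited. A minimal sketch continuing from the snippet above:

```python
import asyncio


async def main() -> None:
    # Same call shape as the sync client, but awaited.
    user = await async_client.chat.completions.create(
        response_model=UserInfo,
        messages=[{"role": "user", "content": "Jason is 25 years old"}],
    )
    print(user)


asyncio.run(main())
```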
Supported provider strings:

- `openai/model-name`: OpenAI models
- `anthropic/model-name`: Anthropic models
- `google/model-name`: Google models
- `mistral/model-name`: Mistral models
- `cohere/model-name`: Cohere models
- `perplexity/model-name`: Perplexity models
- `groq/model-name`: Groq models
- `writer/model-name`: Writer models
- `bedrock/model-name`: AWS Bedrock models
- `cerebras/model-name`: Cerebras models
- `fireworks/model-name`: Fireworks models
- `vertexai/model-name`: Vertex AI models
- `genai/model-name`: Google GenAI models
- `ollama/model-name`: Ollama models
Provider Checklist¶
Use these example strings with from_provider to quickly get started:
- `instructor.from_provider("openai/gpt-4o-mini")`
- `instructor.from_provider("anthropic/claude-3-sonnet")`
- `instructor.from_provider("google/gemini-1.5-flash")`
- `instructor.from_provider("mistral/mistral-large-latest")`
- `instructor.from_provider("cohere/command-r")`
- `instructor.from_provider("perplexity/sonar-small")`
- `instructor.from_provider("groq/llama3-8b-8192")`
- `instructor.from_provider("writer/palmyra-x-004")`
- `instructor.from_provider("bedrock/anthropic.claude-3-sonnet-20240229-v1:0")`
- `instructor.from_provider("cerebras/llama3.1-70b")`
- `instructor.from_provider("fireworks/llama-v3-70b-instruct")`
- `instructor.from_provider("vertexai/gemini-1.5-flash")`
- `instructor.from_provider("genai/gemini-1.5-flash")`
- `instructor.from_provider("ollama/llama3")`
2. Manual Client Setup¶
Alternatively, you can manually set up the client:
1. Install the required dependencies.
2. Import the provider client and patch it with Instructor.
3. Use the patched client with your Pydantic model, as shown below.
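A minimal sketch of all three steps using the OpenAI client (the model string is only an example; swap in your provider's client and model):

```python
# Step 1: install the required dependencies, e.g.
#   pip install instructor openai
import instructor
from openai import OpenAI
from pydantic import BaseModel


class UserInfo(BaseModel):
    name: str
    age: int


# Step 2: import the provider client and patch it with Instructor.
client = instructor.from_openai(OpenAI())

# Step 3: use the patched client with your Pydantic model.
user = client.chat.completions.create(
    model="gpt-4o-mini",
    response_model=UserInfo,
    messages=[{"role": "user", "content": "Jason is 25 years old"}],
)
print(user.name, user.age)
```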
For provider-specific setup and examples, visit each provider's documentation page.
Need Help?¶
If you need assistance with a specific integration:
- Check the provider-specific documentation
- Browse the examples and cookbooks
- Search existing GitHub issues
- Join our Discord community