# Structured Output Integrations
Welcome to the Instructor integrations guide. This section provides detailed information about using structured outputs with various AI model providers.
## Supported Providers
Instructor supports a wide range of AI model providers, each with their own capabilities and features:
### OpenAI-Compatible Models
- OpenAI - GPT-3.5, GPT-4, and other OpenAI models
- Azure OpenAI - Microsoft's Azure-hosted OpenAI models
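Every integration follows the same basic pattern: wrap the provider's client with Instructor, then pass a Pydantic model as `response_model`. A minimal sketch against OpenAI, assuming an `OPENAI_API_KEY` in the environment (the model name is illustrative):

```python
import instructor
from openai import OpenAI
from pydantic import BaseModel


class UserInfo(BaseModel):
    """Schema the model's output is validated against."""
    name: str
    age: int


# Wrap the client so create() accepts a response_model argument.
client = instructor.from_openai(OpenAI())

user = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    response_model=UserInfo,
    messages=[{"role": "user", "content": "John Doe is 30 years old."}],
)
print(user)  # UserInfo(name='John Doe', age=30)
```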
### Open Source & Self-Hosted Models
- Ollama - Run open-source models locally
- llama-cpp-python - Python bindings for llama.cpp
- Together AI - Host and run open source models
- Cortex - Run open source models with Cortex
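Self-hosted backends that expose an OpenAI-compatible endpoint can reuse the same client. A sketch against a local Ollama server, assuming the default port and an illustrative model name:

```python
import instructor
from openai import OpenAI
from pydantic import BaseModel


class City(BaseModel):
    name: str
    country: str


# Ollama serves an OpenAI-compatible API; the base_url is the default
# local address, and the api_key is a placeholder that Ollama ignores.
client = instructor.from_openai(
    OpenAI(base_url="http://localhost:11434/v1", api_key="ollama"),
    mode=instructor.Mode.JSON,  # JSON mode tends to be most reliable locally
)

city = client.chat.completions.create(
    model="llama3",  # illustrative; use whichever model you have pulled
    response_model=City,
    messages=[{"role": "user", "content": "Tell me about Paris."}],
)
print(city)
```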
### Cloud AI Providers
- Anthropic - Claude 3 and Claude 3.5 models
- Google - Gemini and PaLM models
- Vertex AI - Google Cloud's AI platform
- Cohere - Command-R and other Cohere models
- Groq - High-performance inference platform
- Mistral - Mistral's hosted models
- Fireworks - High-performance model inference
- Cerebras - Llama-3-70B and other open-source models with very fast inference
- Writer - Palmyra-X-004 and other Writer models
- DeepSeek - DeepSeek Coder and DeepSeek Chat models
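Switching cloud providers changes the client constructor but not the overall flow. A sketch against Anthropic, assuming an `ANTHROPIC_API_KEY` in the environment (the model id and `max_tokens` value are illustrative):

```python
import anthropic
import instructor
from pydantic import BaseModel


class Summary(BaseModel):
    title: str
    key_points: list[str]


# Same flow as OpenAI, different constructor.
client = instructor.from_anthropic(anthropic.Anthropic())

summary = client.messages.create(
    model="claude-3-5-sonnet-20240620",  # illustrative model id
    max_tokens=1024,  # Anthropic's API requires an explicit token cap
    response_model=Summary,
    messages=[{"role": "user", "content": "Summarize the plot of Hamlet."}],
)
print(summary.key_points)
```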
### Model Management
- LiteLLM - Unified interface for multiple providers
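LiteLLM exposes many providers through a single `completion` function, which Instructor can wrap as well. A sketch, assuming the relevant provider API key is set and using LiteLLM's `provider/model` naming convention:

```python
import instructor
from litellm import completion
from pydantic import BaseModel


class Answer(BaseModel):
    value: int


# One wrapper, many providers: LiteLLM routes based on the model string.
client = instructor.from_litellm(completion)

answer = client.chat.completions.create(
    model="anthropic/claude-3-5-sonnet-20240620",  # LiteLLM "provider/model" form
    max_tokens=1024,
    response_model=Answer,
    messages=[{"role": "user", "content": "What is 6 times 7?"}],
)
print(answer.value)  # 42
```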
## Common Concepts

All integrations share the same core concepts: you define the output schema as a Pydantic model, pass it as `response_model`, and Instructor validates the provider's response against it, re-asking the model when validation fails (`max_retries`) or streaming partial results as they arrive.
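For instance, validation and re-asking work the same way across providers. A sketch using OpenAI with a custom validator (the validator rule and model name are illustrative):

```python
import instructor
from openai import OpenAI
from pydantic import BaseModel, field_validator


class User(BaseModel):
    name: str

    @field_validator("name")
    @classmethod
    def name_must_be_uppercase(cls, v: str) -> str:
        # Illustrative rule: trigger a re-ask unless the name is uppercased.
        if v != v.upper():
            raise ValueError("name must be in uppercase")
        return v


client = instructor.from_openai(OpenAI())

# On validation failure, Instructor re-asks the model up to max_retries
# times, feeding the validation error back into the conversation.
user = client.chat.completions.create(
    model="gpt-4o-mini",
    response_model=User,
    max_retries=3,
    messages=[{"role": "user", "content": "Extract the user: john doe"}],
)
print(user.name)  # "JOHN DOE"
```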
## Need Help?
If you need help with a specific integration:
- Check the provider-specific documentation
- Look at the examples
- Check our GitHub issues
- Join our Discord community