Structured Output Integrations

Welcome to the Instructor integrations guide. This section provides detailed information about using structured outputs with various AI model providers.

Supported Providers

Instructor supports a wide range of AI model providers, each with its own capabilities and features:

OpenAI-Compatible Models

  • OpenAI - GPT-3.5, GPT-4, and other OpenAI models
  • Azure OpenAI - Microsoft's Azure-hosted OpenAI models
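
As a quick orientation, here is a minimal sketch of the basic pattern with the OpenAI provider: patch the client with instructor.from_openai and describe the output you want with a Pydantic model passed via response_model. The model name and prompt are illustrative, and the snippet assumes the openai SDK is installed and an API key is configured; see the OpenAI page for the full set of options. Azure OpenAI should follow the same pattern with openai.AzureOpenAI in place of OpenAI.

```python
import instructor
from openai import OpenAI
from pydantic import BaseModel


class UserInfo(BaseModel):
    """Structure we want the model's answer to follow."""
    name: str
    age: int


# Patch the OpenAI client so create() accepts a response_model
client = instructor.from_openai(OpenAI())

user = client.chat.completions.create(
    model="gpt-4",  # illustrative model name
    response_model=UserInfo,
    messages=[{"role": "user", "content": "John Doe is 30 years old."}],
)
print(user.name, user.age)  # a validated UserInfo instance, not raw JSON
```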

Open Source & Self-Hosted Models

Cloud AI Providers

  • Anthropic - Claude and Claude 2 models
  • Google - PaLM and Gemini models
  • Vertex AI - Google Cloud's AI platform
  • Cohere - Command-R and other Cohere models
  • Groq - High-performance inference platform
  • Mistral - Mistral's hosted models
  • Fireworks - High-performance model inference
  • Cerebras - Llama-3-70B and other open-source models at very high inference speeds
  • Writer - Palmyra-X-004 and other Writer models
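
The same response_model pattern carries over to these providers, with the provider's own client in place of OpenAI's. As one example, a minimal sketch with Anthropic (the model name and max_tokens value are illustrative; see the Anthropic page for details):

```python
import instructor
from anthropic import Anthropic
from pydantic import BaseModel


class UserInfo(BaseModel):
    name: str
    age: int


# Same response_model pattern, different provider client
client = instructor.from_anthropic(Anthropic())

user = client.messages.create(
    model="claude-3-haiku-20240307",  # illustrative model name
    max_tokens=1024,
    response_model=UserInfo,
    messages=[{"role": "user", "content": "John Doe is 30 years old."}],
)
print(user)
```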

Model Management

  • LiteLLM - Unified interface for multiple providers
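
A minimal sketch of routing through LiteLLM, assuming instructor.from_litellm wrapping LiteLLM's completion function: the target provider is selected by the model string, while the response_model usage stays the same. The model names shown are illustrative.

```python
import instructor
from litellm import completion
from pydantic import BaseModel


class UserInfo(BaseModel):
    name: str
    age: int


# Wrap LiteLLM's completion function; the provider is picked by the model string
client = instructor.from_litellm(completion)

user = client.chat.completions.create(
    model="gpt-3.5-turbo",  # illustrative; other model strings route to other providers
    response_model=UserInfo,
    messages=[{"role": "user", "content": "John Doe is 30 years old."}],
)
print(user)
```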

Common Concepts

All integrations share some common concepts: you define a Pydantic response model, pass it to the client via response_model, and Instructor validates the provider's output against it, optionally retrying when validation fails.
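
For example, a minimal sketch combining a response model, a Pydantic validator, and automatic retries. It is shown here with OpenAI, but the same keyword arguments apply to the other patched clients; the model name and retry count are illustrative.

```python
import instructor
from openai import OpenAI
from pydantic import BaseModel, field_validator


class UserInfo(BaseModel):
    name: str
    age: int

    @field_validator("age")
    @classmethod
    def age_is_plausible(cls, v: int) -> int:
        # Validation errors are fed back to the model when retries are enabled
        if v < 0:
            raise ValueError("age must be non-negative")
        return v


client = instructor.from_openai(OpenAI())

user = client.chat.completions.create(
    model="gpt-4",  # illustrative model name
    response_model=UserInfo,
    max_retries=2,  # re-ask the model with the validation error up to 2 times
    messages=[{"role": "user", "content": "John Doe is 30 years old."}],
)
```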

Need Help?

If you need help with a specific integration:

  1. Check the provider-specific documentation
  2. Look at the examples
  3. Check our GitHub issues
  4. Join our Discord community