Installing Instructor¶
Instructor is a Python library that works with various LLM providers to extract structured outputs. This guide covers installing the library and setting up API keys for different providers.
Basic Installation¶
Install the core Instructor package with pip:
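```bash
pip install instructor
```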
Instructor requires Pydantic for defining data models:
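Pydantic is installed automatically as a dependency of Instructor, but it can also be installed or upgraded explicitly:

```bash
pip install pydantic
```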
Setting Up with Different LLM Providers¶
OpenAI¶
OpenAI is the default provider and works out of the box:
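The OpenAI SDK is normally pulled in alongside Instructor; if it is missing from your environment, add it with:

```bash
pip install openai
```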
Set up your OpenAI API key:
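```bash
export OPENAI_API_KEY="your-openai-api-key"  # replace with your actual key
```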
Anthropic (Claude)¶
To use with Anthropic's Claude models:
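```bash
pip install anthropic
```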
Set up your Anthropic API key:
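The Anthropic SDK reads the key from the `ANTHROPIC_API_KEY` environment variable:

```bash
export ANTHROPIC_API_KEY="your-anthropic-api-key"  # replace with your actual key
```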
Google Gemini¶
To use with Google's Gemini models:
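```bash
pip install google-generativeai
```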
Set up your Google API key:
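The Gemini SDK conventionally uses the `GOOGLE_API_KEY` environment variable (assumed here; the key can also be passed directly when configuring the client):

```bash
export GOOGLE_API_KEY="your-google-api-key"  # assumed variable name; replace with your actual key
```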
Cohere¶
To use with Cohere's models:
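```bash
pip install cohere
```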
Set up your Cohere API key:
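The Cohere SDK typically reads the `CO_API_KEY` environment variable (assumed here; the key can also be passed directly to the client):

```bash
export CO_API_KEY="your-cohere-api-key"  # assumed variable name; replace with your actual key
```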
Mistral¶
To use with Mistral AI's models:
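```bash
pip install mistralai
```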
Set up your Mistral API key:
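The Mistral client conventionally looks for the `MISTRAL_API_KEY` environment variable (assumed here):

```bash
export MISTRAL_API_KEY="your-mistral-api-key"  # assumed variable name; replace with your actual key
```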
LiteLLM (Multiple Providers)¶
To use LiteLLM for accessing multiple providers:
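```bash
pip install litellm
```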
Set up API keys for the providers you want to use.
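For example, to route requests to OpenAI and Anthropic through LiteLLM, export the standard keys for those providers:

```bash
export OPENAI_API_KEY="your-openai-api-key"
export ANTHROPIC_API_KEY="your-anthropic-api-key"
```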
Verifying Your Installation¶
You can verify your installation by running a simple extraction:
```python
import instructor
from openai import OpenAI
from pydantic import BaseModel


class Person(BaseModel):
    name: str
    age: int


# Patch the OpenAI client so responses are parsed into the Pydantic model
client = instructor.from_openai(OpenAI())

person = client.chat.completions.create(
    model="gpt-3.5-turbo",
    response_model=Person,
    messages=[
        {"role": "user", "content": "John Doe is 30 years old"}
    ],
)

print(f"Name: {person.name}, Age: {person.age}")
```
Next Steps¶
Now that you've installed Instructor, you can:
- Create your first extraction with a simple model
- Understand the different response models available
- Set up clients for your preferred LLM provider
Check the Client Setup guide to learn how to configure clients for different providers.