What's new in Instructor v2?

Instructor v2 is a large internal rewrite with a deliberately conservative public goal: the library should feel familiar to existing users, while becoming much easier to extend, reason about, and type-check.

The previous architecture accumulated a lot of provider-specific behavior in shared modules. That worked, but it made the codebase harder to grow. Adding a provider could mean touching response parsing, retry logic, multimodal handling, mode normalization, and client setup in several unrelated places.

V2 moves that logic into a provider-owned architecture. The external API stays recognizable. The internals become more explicit.
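As a purely illustrative sketch of the pattern (the class and method names below are hypothetical, not Instructor's actual internals), a provider-owned design puts behavior like mode normalization and response parsing behind one interface per provider, rather than in shared modules:

```python
# Hypothetical illustration of a provider-owned layout; these names are
# NOT Instructor's real internals, just a sketch of the pattern.
from dataclasses import dataclass
from typing import Any, Protocol


class Provider(Protocol):
    """Everything one backend needs lives behind a single interface."""

    def normalize_mode(self, mode: str) -> str: ...
    def parse_response(self, raw: dict[str, Any]) -> str: ...


@dataclass
class ExampleProvider:
    name: str

    def normalize_mode(self, mode: str) -> str:
        # Each provider owns its own mode aliases instead of a shared table.
        return {"json": "json_mode"}.get(mode, mode)

    def parse_response(self, raw: dict[str, Any]) -> str:
        # Response parsing lives with the provider, not in a shared module.
        return raw["choices"][0]["message"]["content"]
```

Under this layout, adding a provider means adding one module that satisfies the interface, instead of touching parsing, retries, and client setup in several unrelated places.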

Announcing Responses API support

We're excited to announce Instructor's integration with OpenAI's new Responses API, OpenAI's newer interface for model interactions. The integration brings the same Pydantic-based structured-output workflow to this endpoint. Let's look at what the integration offers and how it can improve your LLM applications.
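A minimal sketch of the integration, with assumptions labeled: it assumes an `OPENAI_API_KEY` in the environment, that the mode name `Mode.RESPONSES_TOOLS` matches your installed Instructor version, and that the patched client exposes the usual `chat.completions.create` surface for this mode.

```python
# Sketch only: the mode name and model string below are assumptions;
# check them against your installed Instructor version.
import instructor
from pydantic import BaseModel


class Invoice(BaseModel):
    vendor: str
    total: float


if __name__ == "__main__":
    # Requires OPENAI_API_KEY in the environment.
    client = instructor.from_provider(
        "openai/gpt-4o-mini",
        mode=instructor.Mode.RESPONSES_TOOLS,
    )
    invoice = client.chat.completions.create(
        response_model=Invoice,
        messages=[{"role": "user", "content": "ACME Corp billed $1,200."}],
    )
    print(invoice)
```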

Announcing unified provider interface

We are pleased to introduce a significant enhancement to Instructor: the from_provider() function. While Instructor has always focused on providing robust structured outputs, we've observed that many users work with multiple LLM providers. This often involves repetitive setup for each client.

The from_provider() function aims to simplify this process, making it easier to initialize clients and experiment across different models.

from_provider() takes a single string naming the provider and model and returns an Instructor-enhanced client, so switching between popular LLM providers becomes a one-line change.
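A minimal sketch of from_provider() in use. It assumes the `"provider/model"` string format, that the relevant API keys are set via environment variables (e.g. `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`), and that the model names shown are available to your account.

```python
# Sketch only: model strings are assumptions; requires API keys in the
# environment, so the call is guarded behind __main__.
import instructor
from pydantic import BaseModel


class UserInfo(BaseModel):
    name: str
    age: int


if __name__ == "__main__":
    # Swapping providers is a one-line change to the string below.
    client = instructor.from_provider("openai/gpt-4o-mini")
    # client = instructor.from_provider("anthropic/claude-3-5-haiku-latest")

    user = client.chat.completions.create(
        response_model=UserInfo,
        messages=[{"role": "user", "content": "John is 25 years old."}],
    )
    print(user)
```

Because the `response_model` contract is the same for every provider, the extraction code above does not change when the provider string does.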