Using the Command Line Interface for Batch Jobs¶
The instructor CLI manages batch jobs across multiple providers through a single, unified interface, so the same commands work whether a job runs on OpenAI or Anthropic.
Supported Providers¶
- OpenAI: Utilizes OpenAI's robust batch processing capabilities with metadata support
- Anthropic: Leverages Anthropic's advanced language models with cancel/delete operations
The CLI uses a unified --provider flag for all commands, with backward compatibility for legacy flags.
$ instructor batch --help
Usage: instructor batch [OPTIONS] COMMAND [ARGS]...
Manage OpenAI Batch jobs
Options:
  --help  Show this message and exit.

Commands:
  cancel            Cancel a batch job
  create            Create batch job using BatchProcessor
  create-from-file  Create a batch job from a file
  delete            Delete a completed batch job
  download-file     Download the file associated with a batch job
  list              See all existing batch jobs
  results           Retrieve results from a batch job
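A typical workflow chains three of these commands: create a job from a request file, check on it with list, then fetch parsed output with results once it completes. For example (batch IDs and file names below are placeholders):

# 1. Submit a pre-built request file
instructor batch create-from-file --file-path batch_requests.jsonl --model "openai/gpt-4o-mini"

# 2. Watch its status
instructor batch list --provider openai --limit 5

# 3. Once completed, pull the structured results
instructor batch results --batch-id batch_abc123 --output-file results.jsonl --model "openai/gpt-4o-mini"

Each step is covered in detail below.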
Creating a Batch Job¶
List Jobs with Enhanced Display¶
$ instructor batch list --help
Usage: instructor batch list [OPTIONS]
See all existing batch jobs
Options:
  --limit INTEGER         Total number of batch jobs to show
                          [default: 10]
  --poll INTEGER          Time in seconds to wait for the batch job to complete
                          [default: 10]
  --screen / --no-screen  Enable or disable screen output
                          [default: no-screen]
  --live / --no-live      Enable live polling to continuously update the table
                          [default: no-live]
  --provider TEXT         Provider to use (e.g., 'openai', 'anthropic')
                          [default: openai]
  --use-anthropic / --no-use-anthropic
                          [DEPRECATED] Use --model instead. Use Anthropic API
                          instead of OpenAI
                          [default: no-use-anthropic]
  --help                  Show this message and exit.
The enhanced list command shows rich information, including timestamps, durations, and provider-specific metrics:
$ instructor batch list --provider openai --limit 3
Openai Batch Jobs
Batch ID         Status      Created      Started      Duration  Completed  Failed  Total
batch_abc123...  completed   07/07 23:48  07/07 23:48  2m        15         0       15
batch_def456...  processing  07/07 22:30  07/07 22:31  45m       8          0       10
batch_ghi789...  failed      07/07 21:15  N/A          N/A       0          5       5
$ instructor batch list --provider anthropic --limit 2
Anthropic Batch Jobs
Batch ID            Status      Created      Started      Duration  Succeeded  Errored  Processing
msgbatch_abc123...  completed   07/08 03:47  07/08 03:47  1m        20         0        0
msgbatch_def456...  processing  07/08 03:30  07/08 03:30  15m       5          0        10
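To keep an eye on in-flight jobs, the --live and --screen flags documented above can be combined with --poll, for example:

# Continuously refresh the table of OpenAI batch jobs
instructor batch list --provider openai --live --screen --poll 30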
Create From File with Metadata Support¶
You can create batch jobs directly from pre-formatted .jsonl files with enhanced metadata support:
$ instructor batch create-from-file --help
Usage: instructor batch create-from-file [OPTIONS]
Create a batch job from a file
Options:
  * --file-path TEXT          File containing the batch job requests
                              [default: None] [required]
    --model TEXT              Model in format 'provider/model-name' (e.g.,
                              'openai/gpt-4', 'anthropic/claude-3-sonnet')
                              [default: openai/gpt-4o-mini]
    --description TEXT        Description/metadata for the batch job
                              [default: Instructor batch job]
    --completion-window TEXT  Completion window for the batch job (OpenAI only)
                              [default: 24h]
    --use-anthropic / --no-use-anthropic
                              [DEPRECATED] Use --model instead. Use Anthropic API
                              instead of OpenAI
                              [default: no-use-anthropic]
    --help                    Show this message and exit.
Example usage with metadata:
# OpenAI batch with custom metadata
instructor batch create-from-file \
--file-path batch_requests.jsonl \
--model "openai/gpt-4o-mini" \
--description "Email classification batch - production v2.1" \
--completion-window "24h"
# Anthropic batch
instructor batch create-from-file \
--file-path batch_requests.jsonl \
--model "anthropic/claude-3-5-sonnet-20241022" \
--description "Text analysis batch"
For creating .jsonl files, you can use the enhanced BatchProcessor:
from instructor.batch import BatchProcessor
from pydantic import BaseModel, Field
from typing import Literal


class Classification(BaseModel):
    label: Literal["SPAM", "NOT_SPAM"] = Field(
        ..., description="Whether the email is spam or not"
    )


# Create processor
processor = BatchProcessor("openai/gpt-4o-mini", Classification)

# Prepare message conversations
messages_list = [
    [
        {"role": "system", "content": "Classify the following email"},
        {"role": "user", "content": "Hello there I'm a Nigerian prince and I want to give you money"},
    ],
    [
        {"role": "system", "content": "Classify the following email"},
        {"role": "user", "content": "Meeting with Thomas has been set at Friday next week"},
    ],
]

# Create batch file
processor.create_batch_from_messages(
    messages_list=messages_list,
    file_path="batch_requests.jsonl",
    max_tokens=100,
    temperature=0.1,
)
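Before submitting the file, a quick sanity check can confirm it contains one JSON request per line (plain standard-library Python, no instructor-specific assumptions):

import json

# Count the requests in the generated batch file and peek at the first one
with open("batch_requests.jsonl") as f:
    requests = [json.loads(line) for line in f if line.strip()]

print(f"{len(requests)} requests ready to submit")
print("First request keys:", sorted(requests[0].keys()))

The file can then be submitted with the create-from-file command shown above.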
Job Management Operations¶
Cancelling a Batch Job¶
Cancel running batch jobs across all providers:
$ instructor batch cancel --help
Usage: instructor batch cancel [OPTIONS]
Cancel a batch job
Options:
  * --batch-id TEXT     Batch job ID to cancel
                        [default: None] [required]
    --provider TEXT     Provider to use (e.g., 'openai', 'anthropic')
                        [default: openai]
    --use-anthropic / --no-use-anthropic
                        [DEPRECATED] Use --provider 'anthropic' instead. Use
                        Anthropic API instead of OpenAI
                        [default: no-use-anthropic]
    --help              Show this message and exit.
Examples:
# Cancel OpenAI batch
instructor batch cancel --batch-id batch_abc123 --provider openai
# Cancel Anthropic batch
instructor batch cancel --batch-id msgbatch_def456 --provider anthropic
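The deprecated --use-anthropic flag shown in the help above still works for older scripts (it prints a deprecation warning); prefer --provider going forward:

# Legacy form (deprecated), equivalent to --provider anthropic
instructor batch cancel --batch-id msgbatch_def456 --use-anthropic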
Deleting a Batch Job¶
Delete completed batch jobs (currently supported by Anthropic only):
$ instructor batch delete --help
Usage: instructor batch delete [OPTIONS]
Delete a completed batch job
Options:
  * --batch-id TEXT    Batch job ID to delete
                       [default: None] [required]
    --provider TEXT    Provider to use (e.g., 'openai', 'anthropic')
                       [default: openai]
    --help             Show this message and exit.
Examples:
# Delete Anthropic batch (supported)
instructor batch delete --batch-id msgbatch_abc123 --provider anthropic
# Try to delete OpenAI batch (shows helpful message)
instructor batch delete --batch-id batch_ghi789 --provider openai
# Note: OpenAI does not support batch deletion via API
Retrieving Batch Results¶
Get structured results from completed batch jobs:
$ instructor batch results --help
Usage: instructor batch results [OPTIONS]
Retrieve results from a batch job
Options:
  * --batch-id TEXT       Batch job ID to get results from
                          [default: None] [required]
  * --output-file TEXT    File to save the results to
                          [default: None] [required]
    --model TEXT          Model in format 'provider/model-name' (e.g.,
                          'openai/gpt-4', 'anthropic/claude-3-sonnet')
                          [default: openai/gpt-4o-mini]
    --help                Show this message and exit.
Examples:
# Get OpenAI batch results
instructor batch results \
--batch-id batch_abc123 \
--output-file openai_results.jsonl \
--model "openai/gpt-4o-mini"
# Get Anthropic batch results
instructor batch results \
--batch-id msgbatch_def456 \
--output-file anthropic_results.jsonl \
--model "anthropic/claude-3-5-sonnet-20241022"
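The examples above write results to a .jsonl file, one JSON object per line. A minimal sketch for inspecting the output (the exact keys in each record depend on the provider and your response model, so adjust to what you actually see in the file):

import json

# Inspect the structured results written by `instructor batch results`
with open("openai_results.jsonl") as f:
    for line in f:
        if line.strip():
            record = json.loads(line)
            # each record can then be validated against your response model if desired
            print(record)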
Downloading Raw Files (Legacy)¶
For backward compatibility, the download-file command is still available:
$ instructor batch download-file --help
Usage: instructor batch download-file [OPTIONS]
Download the file associated with a batch job
Options:
  * --batch-id TEXT             Batch job ID to download
                                [default: None] [required]
  * --download-file-path TEXT   Path to download file to
                                [default: None] [required]
    --provider TEXT             Provider to use (e.g., 'openai', 'anthropic')
                                [default: openai]
    --help                      Show this message and exit.
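Example, using a placeholder batch ID and output path:

# Download the raw file for an OpenAI batch job
instructor batch download-file \
  --batch-id batch_abc123 \
  --download-file-path raw_output.jsonl \
  --provider openai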
Provider Support Matrix¶
| Operation | OpenAI | Anthropic |
|-----------|--------|-----------|
| List | ✅ Enhanced table | ✅ Enhanced table |
| Create | ✅ With metadata | ✅ File-based |
| Cancel | ✅ Standard API | ✅ Standard API |
| Delete | ❌ Not supported | ✅ Standard API |
| Results | ✅ Structured parsing | ✅ Structured parsing |
Enhanced Features¶
- Rich CLI Tables: Color-coded status, timestamps, duration calculations
- Metadata Support: Add descriptions and custom fields to organize batches
- Unified Commands: Same interface works across all providers
- Provider Detection: Automatic provider detection from model strings
- Error Handling: Clear error messages and helpful notes for unsupported operations
- Backward Compatibility: Legacy flags still work with deprecation warnings
This CLI provides efficient batch job management across all supported providers, with enhanced monitoring and control capabilities.