Builds and optimizes DSPy (dspy) programs end-to-end: signatures, modules, compilation/optimization, evaluation, and debugging. Use when the user mentions dspy/DSPy, Signature, Module, teleprompter/optimizer, compile, evaluate, few-shot, RAG, tool use, or local LLM endpoints (Ollama/vLLM/LM Studio).
Workflow:
- Start small: one Signature, one or a few modules, and a tight eval loop.
- Define the Signature with the smallest useful set of fields.
- Prefer built-in modules (`Predict`, `ChainOfThought`) or a tiny custom `Module`.
- Create train / dev / test splits (even if small).
- Write a metric with the shape `metric(example, pred) -> float/bool`.
- Optimize on train, select by dev, report final numbers on test.
- Use a local OpenAI-compatible base URL when available. Prefer configuring via environment variables or a single "LM factory" in code.
Minimal pattern (adjust to your DSPy version):
```python
import os

import dspy

# Example OpenAI-compatible local endpoint (adjust as needed).
api_base = os.environ.get("OPENAI_API_BASE", "http://localhost:11434/v1")
api_key = os.environ.get("OPENAI_API_KEY", "ollama")  # placeholder for local gateways

# Model name depends on your gateway (e.g., "llama3.1", "qwen2.5", etc.).
# The "openai/" prefix routes DSPy's LiteLLM backend to the OpenAI-compatible API.
model = os.environ.get("DSPY_MODEL", "qwen3:latest")

lm = dspy.LM(model=f"openai/{model}", api_base=api_base, api_key=api_key)
dspy.configure(lm=lm)
```
If the repo already has a working local-LLM helper, reuse it instead of re-inventing configuration.
Prefer a Signature with explicit output fields (and constraints like allowed labels).

Follow this order:
1. Check which DSPy version is in use (pyproject.toml, lockfile, or import behavior).