Python SDK patterns for Opik. Use when working in sdks/python, on SDK APIs, integrations, or message processing.
```
Layer 1: Public API (opik.Opik, @opik.track)
        ↓
Layer 2: Message Processing (queue, batching, retry)
        ↓
Layer 3: REST Client (OpikApi, HTTP)
```
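To make Layer 2 concrete, here is a toy sketch of a queue drained into fixed-size batches. This is an illustrative analogue only; `MAX_BATCH` and `drain_batches` are made-up names, not Opik internals.

```python
from queue import Queue, Empty

# Illustrative only: drain a message queue into fixed-size batches,
# the way a batching layer groups messages before sending them.
MAX_BATCH = 3

def drain_batches(q: Queue) -> list[list[str]]:
    batches, current = [], []
    while True:
        try:
            current.append(q.get_nowait())
        except Empty:
            break
        if len(current) == MAX_BATCH:
            batches.append(current)
            current = []
    if current:  # flush the partial final batch
        batches.append(current)
    return batches

q = Queue()
for i in range(7):
    q.put(f"msg-{i}")
print([len(b) for b in drain_batches(q)])  # → [3, 3, 1]
```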
```python
import opik

# ✅ REQUIRED for async operations
client = opik.Opik()
# ... tracing operations ...
client.flush()  # Must call before exit!
```
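Because `flush()` must run even when tracing code raises, `try`/`finally` is a safe pattern. The snippet below uses a stand-in client so it is self-contained; substitute `opik.Opik()` in real code.

```python
# FakeClient is a stand-in to illustrate the flush-before-exit pattern;
# it is NOT part of the Opik SDK.
class FakeClient:
    def __init__(self):
        self.buffer = ["trace-1", "trace-2"]  # queued async messages
        self.sent = []

    def flush(self):
        self.sent.extend(self.buffer)
        self.buffer.clear()

client = FakeClient()
try:
    pass  # ... tracing operations ...
finally:
    client.flush()  # runs even if the block above raises

print(client.buffer)  # → []
```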
Async (via message queue), fire-and-forget:
- `trace()`, `span()`
- `log_traces_feedback_scores()`
- `experiment.insert()`

Sync (blocking, returns data):
- `create_dataset()`, `get_dataset()`
- `create_prompt()`, `get_prompt()`
- `search_traces()`, `search_spans()`

```python
# ✅ GOOD - integration files assume dependency exists
import anthropic  # Only imported when user uses integration

# ❌ BAD - importing at package level
from opik.integrations import anthropic  # Would fail if not installed
```
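One way to enforce this lazy-import rule is to resolve the dependency only when the integration is used, and fail with an actionable message. The helper name and error text below are illustrative, not Opik APIs.

```python
import importlib

def load_integration(package: str):
    # Hypothetical helper: import the third-party dependency only when
    # the integration is actually used, with a clear install hint on failure.
    try:
        return importlib.import_module(package)
    except ImportError as exc:
        raise ImportError(
            f"{package} is required for this integration; "
            f"install it with `pip install {package}`"
        ) from exc

mod = load_integration("json")  # stdlib stand-in for e.g. anthropic
print(mod.__name__)  # → json
```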
- Library has callbacks? → Pure Callback (LangChain, LlamaIndex)
- No callbacks? → Method Patching (OpenAI, Anthropic)
- Callbacks unreliable? → Hybrid (ADK)
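The method-patching approach can be sketched generically: replace a client method with a wrapper that records the call, then delegates. This is an illustrative analogue with made-up names (`FakeLLMClient`, `track_client`), not `track_anthropic`'s actual code.

```python
import functools

class FakeLLMClient:
    # Stand-in for a third-party client with no callback hooks.
    def create(self, prompt: str) -> str:
        return prompt.upper()

calls = []  # stand-in for spans sent to Opik

def track_client(client):
    # Patch the instance method so every call is recorded, then delegated.
    original = client.create

    @functools.wraps(original)
    def wrapper(prompt: str) -> str:
        result = original(prompt)
        calls.append({"input": prompt, "output": result})
        return result

    client.create = wrapper
    return client

tracked = track_client(FakeLLMClient())
tracked.create("hello")
print(calls)  # → [{'input': 'hello', 'output': 'HELLO'}]
```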
```python
import anthropic
from opik.integrations.anthropic import track_anthropic

client = anthropic.Anthropic()
tracked_client = track_anthropic(client)  # Wraps methods
```
```python
from opik.integrations.langchain import OpikTracer

tracer = OpikTracer()
chain.invoke(input, config={"callbacks": [tracer]})
```
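The callback pattern works because the host library invokes the tracer's hooks itself. A minimal self-contained analogue (`FakeTracer`, `FakeChain`, and the hook names are made up for illustration):

```python
class FakeTracer:
    # Illustrative tracer: records lifecycle events the host library emits.
    def __init__(self):
        self.events = []

    def on_start(self, inputs):
        self.events.append(("start", inputs))

    def on_end(self, outputs):
        self.events.append(("end", outputs))

class FakeChain:
    # Stand-in for a callback-aware library like LangChain.
    def invoke(self, value, callbacks):
        for cb in callbacks:
            cb.on_start(value)
        result = value * 2
        for cb in callbacks:
            cb.on_end(result)
        return result

tracer = FakeTracer()
out = FakeChain().invoke(21, callbacks=[tracer])
print(out, tracer.events)  # → 42 [('start', 21), ('end', 42)]
```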
```python
import opik

@opik.track
def my_function(input: str) -> str:
    # Auto-creates span, captures input/output
    return process(input)
```
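A stripped-down analogue of such a tracking decorator, recording to a list instead of creating real spans (the `spans` store and `track` helper here are illustrative, not SDK internals):

```python
import functools

spans = []  # stand-in for the SDK's span store

def track(fn):
    # Illustrative decorator: record name, input, and output of each call.
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        result = fn(*args, **kwargs)
        spans.append({"name": fn.__name__, "input": args, "output": result})
        return result
    return wrapper

@track
def my_function(text: str) -> str:
    return text.strip().lower()

my_function("  Hello ")
print(spans)  # → [{'name': 'my_function', 'input': ('  Hello ',), 'output': 'hello'}]
```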
Constrain dependency versions with bounded ranges such as `>=2.0.0,<3.0.0`.

Messages batch together for efficiency:
```python
# CRUD: create/get/list/update/delete
client.create_experiment(name="exp")
client.get_dataset(name="ds")

# Search for complex queries
client.search_spans(project_name="proj")
client.search_traces(project_name="proj")

# Batch for bulk operations
client.batch_create_items(...)
```
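Bulk calls benefit from client-side chunking so no single request grows unbounded. A generic helper (the name `chunked` and the chunk size are arbitrary, not an Opik API):

```python
def chunked(items, size):
    # Yield consecutive slices of at most `size` items each.
    for i in range(0, len(items), size):
        yield items[i:i + size]

# Split 10 items into chunks of at most 4 for batched requests.
batches = list(chunked(list(range(10)), 4))
print([len(b) for b in batches])  # → [4, 4, 2]
```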