LLM Council: route any prompt to the best-performing LLM. The API picks the top model for a given query, based on thousands of peer-reviewed council deliberations; you then call that model directly.
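The route-then-call pattern can be sketched as follows. This is a self-contained toy, not the real LLM Council API: the rankings table, the category classifier, and the model names are all made-up placeholders standing in for the council's per-query rankings.

```python
# Toy sketch of route-then-call: pick the top-ranked model for a prompt's
# category, then dispatch to it directly. Rankings and model names are
# hypothetical placeholders, not real council data.

RANKINGS = {
    # hypothetical per-category leaderboards derived from council deliberations
    "code": ["model-a", "model-b"],
    "general": ["model-b", "model-a"],
}

def classify(prompt: str) -> str:
    # Toy classifier; a real router would use the council API's query analysis.
    return "code" if "def " in prompt or "function" in prompt else "general"

def route(prompt: str) -> str:
    """Return the top-ranked model name for this prompt's category."""
    return RANKINGS[classify(prompt)][0]

print(route("write a python function"))  # -> model-a
print(route("summarize this article"))   # -> model-b
```

In the real flow, the ranking lookup is an API call and the final step is a direct request to the chosen model's own endpoint.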
Use Arthas's watch/trace commands to obtain the EagleEye traceId, i.e. the traceId of a request.
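A minimal sketch of one way to do this with Arthas's `watch` command and an OGNL expression. The target class `com.example.demo.OrderService` and its `createOrder` method are hypothetical; the EagleEye class path `com.taobao.eagleeye.EagleEye` and its static `getTraceId()` accessor are assumptions about the EagleEye SDK on your classpath.

```shell
# Attach Arthas to the target JVM (select the process interactively).
curl -O https://arthas.aliyun.com/arthas-boot.jar
java -jar arthas-boot.jar

# Inside the Arthas console: watch a business method (hypothetical example
# class/method) and print the current EagleEye traceId alongside its
# arguments each time it is invoked.
watch com.example.demo.OrderService createOrder \
  '{params, @com.taobao.eagleeye.EagleEye@getTraceId()}' -x 2
```

The `@Class@staticMethod()` form is standard OGNL for invoking a static method, so the same pattern works for reading the traceId from any tracing SDK that exposes one.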