Command-line interface for Novita AI, an OpenAI-compatible API service for models such as DeepSeek, GLM, and others.
This CLI is installed as part of the cli-anything-novita package:
pip install cli-anything-novita
Quick start:
# Show help
cli-anything-novita --help
# Start interactive REPL mode
cli-anything-novita
# Chat with a model
cli-anything-novita chat --prompt "What is AI?" --model deepseek/deepseek-v3.2
# Streaming chat
cli-anything-novita stream --prompt "Write a poem about code"
# List available models
cli-anything-novita models
# JSON output (for agent consumption)
cli-anything-novita --json chat --prompt "Hello"
When invoked without a subcommand, the CLI enters an interactive REPL session:
cli-anything-novita
# Enter commands interactively with tab-completion and history
Chat with AI models through the Novita API.
| Command | Description |
|---|---|
| chat | Chat with the Novita API |
| stream | Stream chat completion |
Session management for chat history.
| Command | Description |
|---|---|
| status | Show session status |
| clear | Clear session history |
| history | Show command history |
Configuration management.
| Command | Description |
|---|---|
| set | Set a configuration value |
| get | Get a configuration value (or show all) |
| delete | Delete a configuration value |
| path | Show the config file path |
API connectivity and model discovery.
| Command | Description |
|---|---|
| test | Test API connectivity |
| models | List available models |
# Set API key via config file (recommended)
cli-anything-novita config set api_key "sk-xxx"
# Or use environment variable
export NOVITA_API_KEY="sk-xxx"
# Simple chat
cli-anything-novita chat --prompt "Explain quantum computing" --model deepseek/deepseek-v3.2
# Streaming chat
cli-anything-novita stream --prompt "Write a Python function to calculate factorial"
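Under the hood, OpenAI-compatible streaming endpoints emit server-sent events, one `data:` line per token delta. A minimal parser for that wire format, assuming the standard chat-completion chunk schema, might look like:

```python
import json

def iter_stream_text(lines):
    """Yield content deltas from OpenAI-style SSE 'data:' lines."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":  # end-of-stream sentinel
            break
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            yield delta["content"]

# Canned chunks in the shape an OpenAI-compatible API typically sends:
sample = [
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo"}}]}',
    'data: [DONE]',
]
print("".join(iter_stream_text(sample)))  # -> Hello
```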
# Verify API key and connectivity
cli-anything-novita test --model deepseek/deepseek-v3.2
# List all available models
cli-anything-novita models
The Novita API supports multiple model providers:
| Model ID | Provider | Description |
|---|---|---|
| deepseek/deepseek-v3.2 | DeepSeek | DeepSeek V3.2 model (default) |
| zai-org/glm-5 | Zhipu AI | GLM-5 model |
| minimax/minimax-m2.5 | MiniMax | MiniMax M2.5 model |
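Because the API is OpenAI-compatible, these model IDs drop straight into a standard chat-completion request. A stdlib sketch that builds (but does not send) such a request; the base URL is an assumption, so verify it against Novita's own API documentation:

```python
import json
import urllib.request

# Assumed OpenAI-compatible endpoint -- confirm against Novita's API docs.
BASE_URL = "https://api.novita.ai/v3/openai"

def build_chat_request(api_key: str, prompt: str,
                       model: str = "deepseek/deepseek-v3.2"):
    """Build an OpenAI-style chat-completion request (not sent here)."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
```

Sending the request with `urllib.request.urlopen(...)` requires a valid API key.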
All commands support dual output modes:
- Human-readable (default)
- JSON (`--json` flag): structured JSON for agent consumption

# Human output
cli-anything-novita chat --prompt "Hello"
# JSON output for agents
cli-anything-novita --json chat --prompt "Hello"
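Assuming the `--json` mode mirrors the OpenAI chat-completion schema (an assumption; inspect a real response first), an agent can extract the reply text like this:

```python
import json

def extract_reply(raw: str) -> str:
    """Pull the assistant message out of an OpenAI-style completion JSON."""
    data = json.loads(raw)
    return data["choices"][0]["message"]["content"]

# Canned response in the assumed schema:
raw = '{"choices": [{"message": {"role": "assistant", "content": "Hi there!"}}]}'
print(extract_reply(raw))  # -> Hi there!
```

In practice `raw` would come from capturing the CLI's stdout, e.g. with `subprocess.run([...], capture_output=True, text=True).stdout`.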
When using this CLI programmatically:
- Use the `--json` flag for parseable output

Version: 1.0.0