The vlmrun chat command enables visual AI chat with Orion directly from your terminal. Process images, videos, and documents with natural-language prompts.
Basic Usage
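A minimal invocation passes an input file with -i and a prompt with -p (the file names and prompt text below are illustrative; per the options table, -p also accepts a file path or stdin):

```bash
# Ask Orion about a local image
vlmrun chat -i photo.jpg -p "What is in this image?"

# Read the prompt from a file instead of an inline string
vlmrun chat -i report.pdf -p prompt.txt
```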
Prompt Sources
Prompts can be provided in three ways (in precedence order): a text string, a file path, or stdin.
Using Skills
Pass a local skill directory with -k to apply skill instructions inline:
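A sketch, assuming a hypothetical skill directory and input file:

```bash
# Apply the instructions from a local skill directory to this request
vlmrun chat -i invoice.pdf -p "Extract the line items" -k ./invoice-skill
```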
The -k flag sends the skill inline with the request (no server-side upload). To create a persistent server-side skill, use vlmrun skills upload.
Stateful Sessions
Use --session-id to persist chat history across multiple calls:
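Reusing the same session ID lets later turns build on earlier context (the UUID, file name, and prompts below are illustrative):

```bash
SESSION=123e4567-e89b-12d3-a456-426614174000

# First turn: analyze an image within the session
vlmrun chat -s "$SESSION" -i chart.png -p "What does this chart show?"

# Second turn: follow up without re-sending the image
vlmrun chat -s "$SESSION" -p "Now summarize that in one sentence."
```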
Models
| Model | Description |
|---|---|
| vlmrun-orion-1:fast | Speed-optimized |
| vlmrun-orion-1:auto | Balanced (default) |
| vlmrun-orion-1:pro | Most capable |
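Select a variant with -m; omitting it uses the default (file name and prompt are illustrative):

```bash
# Use the speed-optimized variant; drop -m to get vlmrun-orion-1:auto
vlmrun chat -i scan.pdf -p "List the section headings" -m vlmrun-orion-1:fast
```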
Output Formats
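By default, responses stream as formatted text; the -j and -ns flags from the options table change that (the exact JSON schema is not specified here):

```bash
# Emit JSON instead of formatted text, with streaming disabled
vlmrun chat -i photo.jpg -p "Describe this image" -j -ns
```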
Artifact Handling
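Downloads can be redirected or skipped with the -o and -nd flags from the options table; a sketch with illustrative file names:

```bash
# Save generated artifacts to a chosen directory
vlmrun chat -i clip.mp4 -p "Extract a thumbnail" -o ./artifacts

# Or skip downloading artifacts entirely
vlmrun chat -i clip.mp4 -p "Extract a thumbnail" -nd
```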
When Orion generates artifacts (images, videos, etc.), they are automatically downloaded.
Supported File Types
| Category | Extensions |
|---|---|
| Images | .jpg, .jpeg, .png, .gif, .webp, .bmp, .tiff |
| Videos | .mp4, .mov, .avi, .mkv, .webm |
| Documents | .pdf, .doc, .docx |
| Audio | .mp3, .wav, .m4a, .flac, .ogg |
Options Reference
| Option | Short | Description |
|---|---|---|
--prompt | -p | Prompt: text string, file path, or stdin |
--input | -i | Input file (repeatable) |
--skill | -k | Path to a skill directory (repeatable) |
--output | -o | Artifact output directory |
--model | -m | Model variant (default: vlmrun-orion-1:auto) |
--json | -j | Output JSON instead of formatted text |
--no-stream | -ns | Disable streaming |
--no-download | -nd | Skip artifact download |
--session-id | -s | Session UUID for stateful conversations |
| --base-url | | API base URL override |
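Several options combine naturally in one call; a fuller sketch (all values are illustrative — note that -i is repeatable for multiple inputs):

```bash
vlmrun chat \
  -i slides.pdf -i demo.mp4 \
  -p "Compare the slides against the demo video" \
  -m vlmrun-orion-1:pro \
  -o ./outputs \
  --session-id 123e4567-e89b-12d3-a456-426614174000
```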