# API Reference

Meet the **VLM Run API** for running production-ready multimodal models. Use it to extract data from PDFs, generate images, summarize videos, and more.

- **Base URL**: `https://api.vlm.run/v1`
- **Auth**: Bearer token via the `Authorization: Bearer <VLMRUN_API_KEY>` header

## Endpoints

**Models**
- `GET` Health
- `GET` Get models

**Hub**
- `GET` List domains
- `POST` Get schema for domain

**Generate**
- `POST` Image -> JSON
- `POST` Doc -> JSON
- `POST` Doc Agent -> JSON
- `POST` Audio -> JSON
- `POST` Video -> JSON

**Predictions**
- `GET` Get predictions by ID
- `GET` Get predictions

**Files**
- `POST` Upload file
- `GET` Get file by ID
- `GET` List files

**Feedback**
- `POST` Submit feedback
- `GET` Get feedback
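As a minimal sketch of the auth scheme above, the snippet below builds an authenticated request with Python's standard library. The helper name `build_request` and the `/health` path used in the usage comment are illustrative assumptions, not part of the official SDKs.

```python
import os
import urllib.request

BASE_URL = "https://api.vlm.run/v1"

def build_request(path: str, api_key: str) -> urllib.request.Request:
    """Build a GET request for the given API path with a Bearer token attached."""
    req = urllib.request.Request(f"{BASE_URL}{path}")
    # Standard Bearer scheme: Authorization: Bearer <VLMRUN_API_KEY>
    req.add_header("Authorization", f"Bearer {api_key}")
    return req

# Example (not executed here): send the request and print the status code.
# with urllib.request.urlopen(build_request("/health", os.environ["VLMRUN_API_KEY"])) as resp:
#     print(resp.status)
```

The official Python and Node.js SDKs wrap this plumbing for you; raw HTTP is shown only to make the header format concrete.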
Below is the quickest path to a successful request: check health, upload a file, then generate or process content.
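The three-step path above can be sketched as plain HTTP requests. This is a hedged illustration: apart from the base URL, every path (`/health`, `/files`, `/document/generate`) and every request-body field here is an assumption; check the per-endpoint reference pages for the documented routes and schemas.

```python
import json
import urllib.request

BASE_URL = "https://api.vlm.run/v1"
API_KEY = "sk-..."  # placeholder; use your real VLMRUN_API_KEY

def authed(req: urllib.request.Request) -> urllib.request.Request:
    """Attach the Bearer token to a prepared request."""
    req.add_header("Authorization", f"Bearer {API_KEY}")
    return req

# 1. Health check (GET) -- confirms the API is reachable before doing real work.
health = authed(urllib.request.Request(f"{BASE_URL}/health"))

# 2. Upload a file (POST) -- the path and raw-bytes payload are assumptions.
upload = authed(urllib.request.Request(
    f"{BASE_URL}/files",  # hypothetical upload path
    data=b"...file bytes...",
    method="POST",
))

# 3. Generate structured output from the uploaded file (POST) --
#    the path and body fields are illustrative, not the documented schema.
generate = authed(urllib.request.Request(
    f"{BASE_URL}/document/generate",  # hypothetical generate path
    data=json.dumps({"file_id": "<id-from-upload>"}).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
))

# Each request would be sent with urllib.request.urlopen(...).
```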
## Health