Instructor Compatibility
Run VLM-1 with the Instructor Python SDK using minimal code changes.
With our new OpenAI-compatible API, you can use the popular Instructor library to interact with VLM Run. This allows developers to switch between the OpenAI, Instructor, and VLM Run APIs with minimal changes.
Configure the OpenAI-compatible Instructor Client with VLM Run
Since Instructor is compatible with the OpenAI API, you can use the same configuration methods as described in the OpenAI Compatibility page.
Here’s an example of how to configure the Instructor client to work with VLM Run:
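The sketch below assumes a placeholder base URL (`https://api.vlm.run/v1/openai`) and a `VLM_API_KEY` environment variable; substitute the actual values from the OpenAI Compatibility page.

```python
import os

import instructor
from openai import OpenAI

# Point the OpenAI client at the VLM Run OpenAI-compatible endpoint.
# The base URL and environment variable below are placeholders; use the
# values documented on the OpenAI Compatibility page.
openai_client = OpenAI(
    api_key=os.environ["VLM_API_KEY"],
    base_url="https://api.vlm.run/v1/openai",
)

# Wrap the client with Instructor to get typed, Pydantic-validated responses.
inst_client = instructor.from_openai(openai_client, mode=instructor.Mode.JSON)
```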
Usage: Chat Completion with Instructor and VLM Run
Now that you have configured the Instructor client, you can use the inst_client.chat.completions.create method to interact with VLM Run.
Below is an example of how to use the Instructor client to create a chat completion:
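A minimal sketch, assuming a `vlm-1` model name, a placeholder image URL, and an illustrative `Invoice` Pydantic model:

```python
from pydantic import BaseModel, Field

# Illustrative response model; define fields to match the document you expect.
class Invoice(BaseModel):
    invoice_number: str = Field(..., description="Invoice identifier")
    total_amount: float = Field(..., description="Total amount due")
    currency: str = Field(..., description="ISO 4217 currency code")

# `response_model` tells Instructor to parse and validate the completion
# into an Invoice instance. The model name and image URL are placeholders.
invoice = inst_client.chat.completions.create(
    model="vlm-1",
    response_model=Invoice,
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Extract the invoice details from this image."},
                {"type": "image_url", "image_url": {"url": "https://example.com/invoice.png"}},
            ],
        }
    ],
)
print(invoice.model_dump_json(indent=2))
```

Because response_model is a Pydantic model, the returned invoice is already parsed and validated; no manual JSON handling is needed.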
Usage: Guided Chat Completion with Instructor and VLM Run
The example above is an unconstrained chat completion request: it relies solely on the JSON schema provided to the VLM as an instruction. To enforce the JSON schema, you can pass the json_schema extra body parameter to guide the VLM Run model to extract structured data from the image.
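A sketch of the same request with the schema passed in the request body. The json_schema key comes from this page; the payload shape (a standard JSON Schema document generated by Pydantic) is an assumption:

```python
# Pass the JSON schema in the request body so the server constrains the output.
# The exact payload accepted under `json_schema` is an assumption here;
# Pydantic's model_json_schema() produces a standard JSON Schema document.
invoice = inst_client.chat.completions.create(
    model="vlm-1",
    response_model=Invoice,
    extra_body={"json_schema": Invoice.model_json_schema()},
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Extract the invoice details from this image."},
                {"type": "image_url", "image_url": {"url": "https://example.com/invoice.png"}},
            ],
        }
    ],
)
```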
This will ensure that the VLM Run model extracts the invoice details in the specified JSON format.