Configure the OpenAI-compatible Instructor Client with VLM Run
Since Instructor is compatible with the OpenAI API, you can use the same configuration methods as described in the OpenAI Compatibility page. Here’s an example of how to configure the Instructor client to work with VLM Run:
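A minimal sketch of the configuration, assuming the API key is stored in a VLMRUN_API_KEY environment variable and that https://api.vlm.run/v1/openai is the OpenAI-compatible base URL described on the OpenAI Compatibility page; substitute the values for your deployment:

```python
import os

import instructor
from openai import OpenAI

# Point the OpenAI client at VLM Run's OpenAI-compatible endpoint.
# Base URL and environment variable name are assumptions; use the values
# from the OpenAI Compatibility page.
openai_client = OpenAI(
    api_key=os.environ["VLMRUN_API_KEY"],
    base_url="https://api.vlm.run/v1/openai",
)

# Wrap the client with Instructor. JSON mode instructs the model to return
# JSON matching the response model's schema, rather than using tool calls.
inst_client = instructor.from_openai(openai_client, mode=instructor.Mode.JSON)
```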
Usage: Chat Completion with Instructor and VLM Run

Now that you have configured the Instructor client, you can use the inst_client.chat.completions.create method to interact with VLM Run.
Below is an example of how to use the Instructor client to create a chat completion:
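The sketch below assumes a hypothetical Invoice response model, an assumed model name ("vlm-1"), and a placeholder image URL; swap in your own schema and image:

```python
from pydantic import BaseModel, Field


class Invoice(BaseModel):
    """Example schema we want the VLM to extract from the image."""

    invoice_number: str = Field(..., description="Invoice identifier")
    total_amount: float = Field(..., description="Total amount due")


# Passing a Pydantic model as `response_model` makes Instructor inject its
# JSON schema into the prompt and validate the response against it.
invoice = inst_client.chat.completions.create(
    model="vlm-1",  # assumed model name; check the VLM Run model list
    response_model=Invoice,
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Extract the invoice details from this image."},
                {"type": "image_url", "image_url": {"url": "https://example.com/invoice.png"}},
            ],
        }
    ],
)
print(invoice.model_dump_json(indent=2))
```

The return value is a validated Invoice instance rather than a raw chat completion, so downstream code can rely on the field types directly.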
Usage: Guided Chat Completion with Instructor and VLM Run
The example above is an unconstrained chat completion request, relying solely on the JSON schema provided to the VLM as an instruction. To enforce the JSON schema, you can use the json_schema extra body parameter to guide the VLM Run model to extract structured data from the image.
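A sketch of the guided variant, reusing the Invoice model from above. The json_schema parameter name comes from this page, but the exact shape of the value passed through extra_body is an assumption; consult the VLM Run API reference for the precise format:

```python
# Same request as before, but the schema is also sent via the `json_schema`
# extra body parameter so VLM Run can constrain generation to the schema
# server-side, rather than relying on the prompt alone.
invoice = inst_client.chat.completions.create(
    model="vlm-1",  # assumed model name
    response_model=Invoice,
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Extract the invoice details from this image."},
                {"type": "image_url", "image_url": {"url": "https://example.com/invoice.png"}},
            ],
        }
    ],
    extra_body={"json_schema": Invoice.model_json_schema()},
)
```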