Providers

LLM Vision combines multiple AI providers into one easy-to-use integration.

Setting up your First Provider

  1. Press Add Entry

  2. Select the provider you want to set up

  3. Fill out the required information

  4. Press Submit

Reconfigure a Provider

Existing providers can be reconfigured at any time from the LLM Vision entry under Devices & Services in the Home Assistant settings.

The following AI providers are supported. If you are unsure which model to use, there is a comparison available here: Choosing the right model

Cloud-based Providers

Easy to set up and blazingly fast.

OpenAI (Pay as you go)
Supported Models: gpt-4o, gpt-4o-mini

Anthropic (Pay as you go)
Supported Models: claude3.5-sonnet, claude3.5-haiku

Google (Free / Pay as you go)
Supported Models: gemini-2.0, gemini-1.5

Groq (Free)
Supported Models: llama3.2-vision

AWS Bedrock (Pay as you go)
Supported Models: Amazon Nova, claude3.5-sonnet, claude3.5-haiku

Self-hosted Providers

Achieve maximum privacy by hosting LLMs on a local machine.

Ollama
Supported Models: gemma3, llama3.2-vision, minicpm-v, llava1.6

LocalAI
Supported Models: gemma3, llama3.2-vision, minicpm-v, llava1.6

Open WebUI
Supported Models: gemma3, llama3.2-vision, minicpm-v, llava1.6

Setup

Each provider is slightly different, but most require an API key. Self-hosted providers instead need a base URL and port (for example, an Ollama instance reachable at http://192.168.1.100:11434, Ollama's default port).

To set up your first provider:

  1. Navigate to Devices & Services in the Home Assistant settings

  2. Add Integration

  3. Search for 'LLM Vision'

  4. Select the provider you want to set up from the dropdown

  5. Enter all required details

On success you will see LLM Vision in your integrations. From here you can set up new providers or delete existing ones.

Using Providers in Actions

You can have multiple configurations per provider. This is especially useful for local providers, for example when two machines host different models.

Multiple configurations for Ollama with different hosts

When running an action, you can select one of your provider configurations:

Selecting a provider
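In an automation, the same choice appears as the provider field of the action call. The sketch below is illustrative only: the action name follows LLM Vision's image analyzer, while the camera entity, prompt, and token limit are placeholders you would replace with your own values (the UI selector fills in the provider configuration for you).

```yaml
# Illustrative sketch of calling an LLM Vision action from an automation
# with one of your provider configurations selected.
# camera.front_door and the prompt are placeholders for your own setup.
action: llmvision.image_analyzer
data:
  provider: <your provider configuration>  # chosen from the dropdown
  message: "Describe what the camera sees."
  image_entity:
    - camera.front_door
  max_tokens: 100
```

Because each configuration is selected per action call, two automations can use different providers (for example, a fast cloud model for notifications and a local model for privacy-sensitive cameras).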
