AI Providers

Supported AI providers and how to configure them.

Supported Providers

Transcribe works with any OpenAI-compatible API. Built-in presets:

Provider      Endpoint                                                  Key Required
------------  --------------------------------------------------------  ------------
Ollama        http://localhost:11434                                    No
OpenAI        https://api.openai.com/v1                                 Yes
OpenRouter    https://openrouter.ai/api/v1                              Yes
Mistral       https://api.mistral.ai/v1                                 Yes
Gemini        https://generativelanguage.googleapis.com/v1beta/openai   Yes
Groq          https://api.groq.com/openai/v1                            Yes
Together AI   https://api.together.xyz/v1                               Yes
Custom        You specify                                               Configurable
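
For reference, the presets above can be modeled as a small lookup table. This is a hypothetical sketch for illustration, not Transcribe's actual configuration format:

```python
# Hypothetical representation of the built-in presets table above;
# Transcribe's real configuration format may differ.
PRESETS = {
    "Ollama":      {"endpoint": "http://localhost:11434", "key_required": False},
    "OpenAI":      {"endpoint": "https://api.openai.com/v1", "key_required": True},
    "OpenRouter":  {"endpoint": "https://openrouter.ai/api/v1", "key_required": True},
    "Mistral":     {"endpoint": "https://api.mistral.ai/v1", "key_required": True},
    "Gemini":      {"endpoint": "https://generativelanguage.googleapis.com/v1beta/openai", "key_required": True},
    "Groq":        {"endpoint": "https://api.groq.com/openai/v1", "key_required": True},
    "Together AI": {"endpoint": "https://api.together.xyz/v1", "key_required": True},
}

def needs_key(provider: str) -> bool:
    """Return True if the named preset requires an API key."""
    return PRESETS[provider]["key_required"]
```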

Ollama (Local/Free)

Ollama runs AI models locally on your Mac, so no API key is needed and your data never leaves the machine.

  1. Install from ollama.com.
  2. Download a model (e.g. ollama pull llama3).
  3. Transcribe auto-detects when Ollama is running.
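
Auto-detection presumably amounts to checking whether anything is listening on Ollama's default port. A minimal sketch of such a probe (the function name is hypothetical, not part of Transcribe):

```python
import socket

def ollama_running(host: str = "localhost", port: int = 11434) -> bool:
    # Hypothetical probe: attempt a TCP connection to Ollama's default
    # port; if it succeeds, the server is up. A refused or timed-out
    # connection means Ollama isn't running.
    try:
        with socket.create_connection((host, port), timeout=0.5):
            return True
    except OSError:
        return False
```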

OpenAI

Create an API key at platform.openai.com. Recommended models:

Custom Providers

Select "Custom" as your provider and enter the base URL of any OpenAI-compatible API. The chat completions endpoint should be reachable at {base_url}/v1/chat/completions or {base_url}/chat/completions.
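
A small helper can normalize whichever form the base URL takes. This is an illustrative sketch, not Transcribe's code; it assumes the base URL either already ends in /v1 or carries no version path at all:

```python
def chat_completions_url(base_url: str) -> str:
    # Illustrative helper: derive the chat completions endpoint from a
    # base URL, appending /v1 only when the URL doesn't already end in it.
    base = base_url.rstrip("/")
    if not base.endswith("/v1"):
        base += "/v1"
    return base + "/chat/completions"
```

For example, an Ollama base URL of http://localhost:11434 resolves to http://localhost:11434/v1/chat/completions, while https://api.openai.com/v1 is used as-is.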