AI Mod Setup Guide
This page explains how to configure different AI providers for the AI Mod. Choose the provider that best fits your needs: some are free, others require payment or self-hosting.
OpenRouter (Recommended)
OpenRouter allows you to access a wide selection of models via a single API.
Steps to set up:
- Create an API key here: https://openrouter.ai/settings/keys
- Copy the key and paste it into the “API key” field in the provider settings.
- Browse available models here: https://openrouter.ai/models
- Copy the Model ID (click the button next to the model name) and paste it into the “Model” field in the provider settings.
Notes:
- Some models are free, others are paid (via per-request credits).
- Many popular models are available, including GPT-4, Claude, Mistral, and MythoMax.
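For a quick sanity check outside the mod, OpenRouter exposes an OpenAI-compatible chat completions endpoint. The sketch below sends a single test request; the API key and model ID are placeholders you would replace with your own values:

```python
# Minimal check that an OpenRouter key and Model ID work.
# OpenRouter's API is OpenAI-compatible at https://openrouter.ai/api/v1.
import requests

API_KEY = "sk-or-..."                    # placeholder: your OpenRouter key
MODEL = "mistralai/mistral-7b-instruct"  # placeholder: any Model ID from openrouter.ai/models

response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": MODEL,
        "messages": [{"role": "user", "content": "Say hello."}],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

If this prints a reply, the same key and Model ID will work in the provider settings.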
OpenAI
This provider supports all models exposed through the official OpenAI API, including GPT-3.5 and GPT-4.
Steps to set up:
- Create an API key here: https://platform.openai.com/api-keys
- Paste the key into the “API key” field in the provider settings.
- Use the model name (e.g. `gpt-3.5-turbo` or `gpt-4`) in the “Model” field.
Notes:
- Usage is billed based on tokens (input + output).
- Official, stable access to OpenAI models.
- No NSFW content is allowed; requests may be blocked or filtered.
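If you want to verify the key before pasting it into the mod, a minimal sketch using the official `openai` Python package (the key is a placeholder) looks like this:

```python
# Quick sanity check with the official openai Python package (v1 client).
from openai import OpenAI

client = OpenAI(api_key="sk-...")  # placeholder: the same key you paste into the provider settings

completion = client.chat.completions.create(
    model="gpt-3.5-turbo",  # or "gpt-4"
    messages=[{"role": "user", "content": "Say hello."}],
)
print(completion.choices[0].message.content)
```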
Ollama (Self-Hosted)
Ollama allows you to run models locally on your own machine.
Steps to set up:
- Download and install Ollama: https://ollama.com/
- Launch it and pull a model, e.g. `ollama pull llama3` (or `ollama run llama3` to download and start it in one step)
- In the provider settings, use:
- API URL: `http://localhost:11434`
- Model: `llama3` (or any model you've pulled)
Notes:
- Runs locally, no data is sent to third parties.
- You need a capable GPU for most models.
- Useful for full privacy and offline use.
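To confirm the local server is reachable, here is a minimal sketch that assumes the default port and an already-pulled `llama3` model:

```python
# Ask a locally running Ollama instance for a completion.
import requests

response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",       # any model you have pulled
        "prompt": "Say hello.",
        "stream": False,         # return one JSON object instead of a stream
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["response"])
```

If this prints a reply, the same API URL and model name will work in the provider settings.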
Cloudflare AI Gateway
Cloudflare AI Gateway is a proxy layer that routes requests to other AI providers through Cloudflare's infrastructure.
Steps to set up:
- Visit the documentation: https://developers.cloudflare.com/ai-gateway/
- Register and configure your gateway.
- Use the provided endpoint and key in your provider settings.
Notes:
- Free tier available with some limits.
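As a rough sketch of what "routing through the gateway" means, the example below points an OpenAI-style client at a gateway URL instead of the provider directly. The account ID and gateway name are placeholders, and the exact endpoint format should be copied from your gateway dashboard or the documentation above:

```python
# Route an OpenAI request through a Cloudflare AI Gateway instead of api.openai.com.
from openai import OpenAI

ACCOUNT_ID = "your-account-id"        # placeholder
GATEWAY_NAME = "your-gateway-name"    # placeholder

client = OpenAI(
    api_key="sk-...",  # still your provider's key; the gateway only proxies the request
    base_url=f"https://gateway.ai.cloudflare.com/v1/{ACCOUNT_ID}/{GATEWAY_NAME}/openai",
)

completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Say hello."}],
)
print(completion.choices[0].message.content)
```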
Compatibility with OpenAI-like APIs
Many other providers (e.g., Together.ai, DeepInfra, Groq) support OpenAI-compatible APIs. If a provider gives you:
- a base URL,
- an API key, and
- a model name,
…you can usually plug those into the AI Mod using the “OpenAI” provider option.
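As a sketch of how those three pieces fit together, an OpenAI-style client only needs the base URL, key, and model name swapped out (all values below are placeholders for whatever your provider gives you):

```python
# Generic OpenAI-compatible provider: only the base URL, key, and model name change.
from openai import OpenAI

client = OpenAI(
    api_key="your-provider-api-key",                   # placeholder
    base_url="https://api.example-provider.com/v1",    # placeholder: the provider's OpenAI-compatible base URL
)

completion = client.chat.completions.create(
    model="provider-model-name",                       # placeholder
    messages=[{"role": "user", "content": "Say hello."}],
)
print(completion.choices[0].message.content)
```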
Important:
- Some models may block or filter adult content.
- If using NSFW content, prefer self-hosted or OpenRouter models that explicitly allow it.