Add email capabilities to your self-hosted Open WebUI instance through MCP. Manage email alongside local or cloud LLMs.
Open WebUI is a self-hosted web interface for large language models that supports Ollama, OpenAI-compatible APIs, and a growing ecosystem of plugins. With MCP integration, you can extend Open WebUI with external tools like MultiMail, adding email capabilities to your private AI chat environment.
Open WebUI supports MCP servers through its tool system, primarily using SSE transport. Configuration is managed through the admin settings panel in the web interface. Once connected, MultiMail's email tools are available to all conversations and can be used with any supported model backend.
For organizations running local LLMs through Ollama that want email automation, the Open WebUI and MultiMail combination provides AI-powered email with full control over your infrastructure. Your model runs locally, your chat history stays private, and only the email operations go through MultiMail's API.
Sign up at multimail.dev and create an API key from your dashboard. This key authenticates all MCP server requests.
In the MultiMail dashboard, create a mailbox for your Open WebUI instance. Set oversight to `gated_send` so outbound emails require your approval.
Navigate to Admin Settings > Tools in your Open WebUI instance. Add a new MCP server connection with the MultiMail configuration. Open WebUI primarily uses SSE transport for MCP servers.
```json
{
  "mcpServers": {
    "multimail": {
      "command": "npx",
      "args": ["-y", "@multimail/mcp-server"],
      "env": {
        "MULTIMAIL_API_KEY": "mm_live_your_key_here"
      }
    }
  }
}
```

In Open WebUI's model settings, ensure tool use is enabled for the model you want to use with MultiMail. Not all models support function calling, so choose one that does (e.g., GPT-4, Claude, or a capable local model).
Start a new chat and ask the AI to list your MultiMail mailboxes. If configured correctly, the email tools will appear in the tool selection and the model will be able to use them.
| Tool | Description | Example |
|---|---|---|
| send_email | Send an email from your agent mailbox to any recipient. | Send a research summary to collaborators from your private chat. |
| reply_email | Reply to a received email, maintaining the thread context. | Reply to a team discussion using insights from your local AI. |
| check_inbox | Check a mailbox for new or unread messages. | Check for new emails from your self-hosted chat interface. |
| read_email | Read the full content of a specific email including headers and body. | Read an email and ask your local model to analyze its contents. |
| create_mailbox | Create a new agent mailbox on your MultiMail account. | Create a mailbox for your organization's AI assistant. |
| list_mailboxes | List all mailboxes on your MultiMail account. | View available mailboxes before composing an email. |
| get_thread | Retrieve an entire email thread to see the full conversation history. | Pull a full email thread for the AI to analyze and summarize. |
| search_contacts | Search your contact list by name, email, or tags. | Find contacts to include in a group email notification. |
| add_contact | Add a new contact to your MultiMail contact list. | Save a new contact's email after receiving their message. |
| tag_email | Tag or categorize an email for organization and filtering. | Tag emails by topic for later retrieval and analysis. |
| list_pending | List all emails currently awaiting human approval. | Review pending outbound emails from the admin dashboard. |
| decide_email | Approve or reject a pending email in the oversight queue. | Approve a response email after reviewing it in the web UI. |
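The last two tools in the table implement the `gated_send` oversight flow. As a rough sketch of that lifecycle (the class, states, and method names below are illustrative, not MultiMail's actual implementation): a send under `gated_send` enters a pending queue, and nothing leaves until a human decision is recorded.

```python
from dataclasses import dataclass, field

@dataclass
class OversightQueue:
    """Illustrative model of a gated_send approval queue."""
    pending: dict[int, str] = field(default_factory=dict)
    _next_id: int = 1

    def submit(self, draft: str) -> int:
        """Like send_email under gated_send: the draft is queued, not sent."""
        email_id = self._next_id
        self.pending[email_id] = draft
        self._next_id += 1
        return email_id

    def list_pending(self) -> list[int]:
        """Like list_pending: ids awaiting human review."""
        return sorted(self.pending)

    def decide(self, email_id: int, approve: bool) -> str:
        """Like decide_email: approve releases the email, reject discards it."""
        self.pending.pop(email_id)
        return "sent" if approve else "rejected"

queue = OversightQueue()
eid = queue.submit("Hi team, summary attached.")
print(queue.list_pending())  # the draft waits here until reviewed
print(queue.decide(eid, approve=True))
```

The point of the design is that the model can draft freely while delivery remains a separate, human-gated step.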
```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
      - MULTIMAIL_API_KEY=mm_live_your_key_here
    volumes:
      - open-webui:/app/backend/data
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama
volumes:
  open-webui:
  ollama:
```

Run Open WebUI with Ollama and MCP support via Docker Compose.
```
Check my MultiMail inbox for the most recent email. Read it and give me a summary. If it requires a response, draft one and show it to me before sending.
```

Use a local Ollama model with MultiMail for private AI email.
```
Read the latest support email from my inbox. Search our uploaded knowledge base documents for relevant information, then draft a comprehensive reply that references our documentation.
```

Combine Open WebUI's RAG pipeline with MultiMail for informed replies.
Not all LLMs support function calling well. For reliable MCP tool use in Open WebUI, use models known for strong tool support: GPT-4/4o, Claude 3.5+, or local models specifically fine-tuned for function calling. Smaller local models may struggle with MCP tool invocation.
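Concretely, "supports function calling" means the model can emit structured tool invocations that match a JSON Schema. As an illustrative sketch (this schema is hand-written for this guide; the field names are not taken from MultiMail's published tool definitions), a `send_email` tool exposed to such a model looks roughly like:

```python
import json

# Hypothetical tool schema; property names here are illustrative only.
send_email_tool = {
    "name": "send_email",
    "description": "Send an email from the agent mailbox.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "to": {"type": "string", "description": "Recipient address"},
            "subject": {"type": "string"},
            "body": {"type": "string"},
        },
        "required": ["to", "subject", "body"],
    },
}

# A function-calling model replies with arguments matching the schema,
# which the client checks before actually invoking the tool.
call_args = json.loads('{"to": "team@example.com", "subject": "Summary", "body": "..."}')
missing = [k for k in send_email_tool["inputSchema"]["required"] if k not in call_args]
print(missing)  # an empty list means the call is well-formed
```

Smaller local models often fail at exactly this step: they produce free text instead of arguments that validate against the schema.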
Open WebUI's RAG pipeline lets you upload documents as knowledge sources. When composing emails, the AI can reference your uploaded documentation, making email responses more accurate and consistent with your organization's information.
Open WebUI supports multiple users with role-based access. Admin users can configure MCP servers that are available to all users. Use MultiMail's oversight mode to review emails from any user before they're sent.
Open WebUI's MCP support favors SSE transport. If stdio doesn't work directly, you may need to use a stdio-to-SSE bridge or check Open WebUI's latest documentation for updated MCP transport support.
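One way to front a stdio MCP server with an SSE endpoint is to run a bridge process alongside it. The fragment below is a sketch only: it assumes the community `supergateway` npm package and the `node:20-alpine` image, and the flags and port are assumptions to verify against that project's current documentation before use.

```yaml
# Hypothetical stdio-to-SSE bridge service to add to the Compose file above.
# Assumes the "supergateway" npm package; verify flags before use.
mcp-bridge:
  image: node:20-alpine
  command: >
    npx -y supergateway
    --stdio "npx -y @multimail/mcp-server"
    --port 8000
  environment:
    - MULTIMAIL_API_KEY=mm_live_your_key_here
  ports:
    - "8000:8000"
```

Open WebUI would then be pointed at the bridge's SSE endpoint instead of launching the stdio server directly.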
Email infrastructure built for AI agents. Verifiable identity, graduated oversight, and a 38-tool MCP server. Formally verified in Lean 4.