Connect MultiMail to Open WebUI for AI Email

Add email capabilities to your self-hosted Open WebUI instance through MCP. Manage email alongside local or cloud LLMs.


Open WebUI is a self-hosted web interface for large language models that supports Ollama, OpenAI-compatible APIs, and a growing ecosystem of plugins. With MCP integration, you can extend Open WebUI with external tools like MultiMail, adding email capabilities to your private AI chat environment.

Open WebUI supports MCP servers through its tool system, primarily using SSE transport. Configuration is managed through the admin settings panel in the web interface. Once connected, MultiMail's email tools are available to all conversations and can be used with any supported model backend.

For organizations running local LLMs through Ollama and wanting email automation, the Open WebUI and MultiMail combination provides AI-powered email with full control over your infrastructure. Your model runs locally, your chat stays private, and only the email operations go through MultiMail's API.

Get started

1. Get your MultiMail API key

Sign up at multimail.dev and create an API key from your dashboard. This key authenticates all MCP server requests.
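Every request the MCP server makes carries this key. As a sketch of what that authentication looks like (the endpoint URL and bearer-token scheme here are assumptions for illustration; check MultiMail's API reference for the actual base URL and header format):

```python
import urllib.request

API_KEY = "mm_live_your_key_here"  # from the MultiMail dashboard

# Hypothetical endpoint and auth scheme -- verify against MultiMail's API docs.
req = urllib.request.Request(
    "https://api.multimail.dev/v1/mailboxes",
    headers={"Authorization": f"Bearer {API_KEY}"},
)

# The MCP server attaches this header to every request it makes on your behalf.
print(req.get_header("Authorization"))  # -> Bearer mm_live_your_key_here
```

If the key is missing or wrong, tool calls will fail with an authentication error rather than silently doing nothing, so this is the first thing to check when debugging.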

2. Create an agent mailbox

In the MultiMail dashboard, create a mailbox for your Open WebUI instance. Set oversight to gated_send so outbound emails require your approval.
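Conceptually, gated_send means an outbound email is held in a pending state until a human approves or rejects it. A minimal sketch of that state machine (illustrative only, not MultiMail's real API):

```python
# Illustrative model of the gated_send oversight flow: an outbound email
# is held as "pending" and only transitions when a human decides.

class OutboundEmail:
    def __init__(self, to: str, subject: str):
        self.to = to
        self.subject = subject
        self.status = "pending"  # gated_send: nothing is sent automatically

    def decide(self, approve: bool):
        if self.status != "pending":
            raise ValueError("email already decided")
        self.status = "sent" if approve else "rejected"

email = OutboundEmail("[email protected]", "Weekly summary")
assert email.status == "pending"   # held for review
email.decide(approve=True)
print(email.status)                # -> sent
```

The list_pending and decide_email tools described later operate on exactly this queue, so you can review and release emails from the chat itself or from the dashboard.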

3. Configure MCP in Open WebUI

Navigate to Admin Settings > Tools in your Open WebUI instance and add a new MCP server connection with the MultiMail configuration below. Note that Open WebUI primarily uses SSE transport for MCP servers; if the stdio configuration is not accepted, see the transport notes under Best practices.

json
{
  "mcpServers": {
    "multimail": {
      "command": "npx",
      "args": ["-y", "@multimail/mcp-server"],
      "env": {
        "MULTIMAIL_API_KEY": "mm_live_your_key_here"
      }
    }
  }
}

4. Enable tools in your model config

In Open WebUI's model settings, ensure that tool use is enabled for the model you want to use with MultiMail. Not all models support function calling, so choose one that does (e.g., GPT-4, Claude, or a capable local model).

5. Verify the connection

Start a new chat and ask the AI to list your MultiMail mailboxes. If configured correctly, the email tools will appear in the tool selection and the model will be able to use them.
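Under the hood, an MCP client like Open WebUI verifies a server with a JSON-RPC handshake and then enumerates its tools. A sketch of the two messages involved, assuming the standard MCP protocol shapes:

```python
import json

# JSON-RPC messages an MCP client exchanges with the server on startup.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "open-webui", "version": "0.0.0"},
    },
}

# After initialization, the client asks the server to list its tools; the
# response should include send_email, check_inbox, and the rest.
list_tools = {"jsonrpc": "2.0", "id": 2, "method": "tools/list"}

wire = "\n".join(json.dumps(m) for m in (initialize, list_tools))
```

If the tools never appear in the chat's tool selector, this handshake is where the connection is failing: check the server logs for a rejected initialize or an authentication error.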


Available MCP tools

| Tool | Description | Example |
| --- | --- | --- |
| send_email | Send an email from your agent mailbox to any recipient. | Send a research summary to collaborators from your private chat. |
| reply_email | Reply to a received email, maintaining the thread context. | Reply to a team discussion using insights from your local AI. |
| check_inbox | Check a mailbox for new or unread messages. | Check for new emails from your self-hosted chat interface. |
| read_email | Read the full content of a specific email including headers and body. | Read an email and ask your local model to analyze its contents. |
| create_mailbox | Create a new agent mailbox on your MultiMail account. | Create a mailbox for your organization's AI assistant. |
| list_mailboxes | List all mailboxes on your MultiMail account. | View available mailboxes before composing an email. |
| get_thread | Retrieve an entire email thread to see the full conversation history. | Pull a full email thread for the AI to analyze and summarize. |
| search_contacts | Search your contact list by name, email, or tags. | Find contacts to include in a group email notification. |
| add_contact | Add a new contact to your MultiMail contact list. | Save a new contact's email after receiving their message. |
| tag_email | Tag or categorize an email for organization and filtering. | Tag emails by topic for later retrieval and analysis. |
| list_pending | List all emails currently awaiting human approval. | Review pending outbound emails from the admin dashboard. |
| decide_email | Approve or reject a pending email in the oversight queue. | Approve a response email after reviewing it in the web UI. |
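When a model invokes one of these tools, the client wraps the call in a standard MCP tools/call request. A sketch of what a send_email invocation might look like on the wire (the argument names are illustrative assumptions; the server's published tool schema is authoritative):

```python
import json

# Hypothetical send_email invocation -- the field names inside "arguments"
# are illustrative, not MultiMail's documented schema.
call = {
    "jsonrpc": "2.0",
    "id": 3,
    "method": "tools/call",
    "params": {
        "name": "send_email",
        "arguments": {
            "to": "[email protected]",
            "subject": "Research summary",
            "body": "Here is the summary we discussed...",
        },
    },
}

print(json.dumps(call, indent=2))
```

With gated_send oversight enabled, a call like this lands in the pending queue rather than sending immediately; list_pending and decide_email then operate on that queued item.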

Usage examples

Docker Compose setup
yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
      - MULTIMAIL_API_KEY=mm_live_your_key_here
    volumes:
      - open-webui:/app/backend/data
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama
volumes:
  open-webui:
  ollama:

Run Open WebUI alongside Ollama via Docker Compose; the MULTIMAIL_API_KEY environment variable is made available inside the container for the MCP server configuration.

Local AI with email capabilities
text
Check my MultiMail inbox for the most recent email. Read it and give me a summary. If it requires a response, draft one and show it to me before sending.

Use a local Ollama model with MultiMail for private AI email.

RAG-enhanced email responses
text
Read the latest support email from my inbox. Search our uploaded knowledge base documents for relevant information, then draft a comprehensive reply that references our documentation.

Combine Open WebUI's RAG pipeline with MultiMail for informed replies.


Best practices

Choose a tool-capable model

Not all LLMs support function calling well. For reliable MCP tool use in Open WebUI, use models known for strong tool support: GPT-4/4o, Claude 3.5+, or local models specifically fine-tuned for function calling. Smaller local models may struggle with MCP tool invocation.

Combine with RAG for informed emails

Open WebUI's RAG pipeline lets you upload documents as knowledge sources. When composing emails, the AI can reference your uploaded documentation, making email responses more accurate and consistent with your organization's information.

Multi-user email access

Open WebUI supports multiple users with role-based access. Admin users can configure MCP servers that are available to all users. Use MultiMail's oversight mode to review emails from any user before they're sent.

Open WebUI primarily uses SSE transport

Open WebUI's MCP support favors SSE transport. If stdio doesn't work directly, you may need to use a stdio-to-SSE bridge or check Open WebUI's latest documentation for updated MCP transport support.
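One way to bridge a stdio MCP server to SSE is a community proxy such as supergateway. The command below is a sketch only; the package name, flags, and default SSE path are assumptions, so verify them against the bridge's own README before relying on it:

```shell
# Expose the stdio MultiMail MCP server over SSE on port 8000.
# (supergateway flags are assumptions -- check its documentation.)
MULTIMAIL_API_KEY=mm_live_your_key_here \
  npx -y supergateway \
    --stdio "npx -y @multimail/mcp-server" \
    --port 8000
```

You would then point Open WebUI's MCP connection at the bridge's SSE endpoint (typically http://localhost:8000/sse) instead of launching the server directly.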


Common questions

Does Open WebUI support stdio or SSE for MCP?
Open WebUI primarily supports SSE transport for MCP servers. Check the latest Open WebUI documentation for stdio support, as the platform is actively evolving. If only SSE is supported, you may need to run the MultiMail MCP server behind an SSE proxy or wait for a future update.
Can I use a local Ollama model with MultiMail?
Yes, but choose your model carefully. The AI model handles the conversation and tool invocation logic, while MultiMail handles the actual email operations. Use a local model with strong function-calling support for reliable MCP tool use. Models like Mistral or Llama variants with tool support work best.
Is my chat data private when using MultiMail?
Your chat conversations stay within your Open WebUI instance (and your local Ollama if using local models). Only the specific email operations — sending, reading, and managing emails — go through MultiMail's API. Your prompts and conversation history are not sent to MultiMail.
Can multiple users share the same MultiMail mailbox?
Yes. The MCP server is configured at the instance level, so all Open WebUI users share the same MultiMail connection. Emails are sent from the configured mailbox regardless of which user initiates the action. Use MultiMail's oversight queue to review all outbound emails.
How do I update the MCP server in Docker?
The npx command fetches the latest @multimail/mcp-server version each time it runs. Restart your Open WebUI container to pick up new versions. If you need a specific version, pin it in the configuration like '@multimail/[email protected]'.
