MoA Chat - Documentation
What is MoA Chat?
MoA Chat is a simple but powerful chat platform where multiple AI models answer the same question at the same time, and an aggregator model combines their outputs into one final answer.
- Built in Python 3.
- Web framework: Flask.
- Frontend: HTML, JavaScript, TailwindCSS (optional; it can be removed).
- Designed to work first on Hugging Face Spaces, but can also be self-hosted.

Features
- Send your question once; multiple AI models answer it simultaneously.
- Aggregator model (LLM-D) summarizes all responses.
- Fully configurable: choose which models you want to use.
- Modern minimal UI with light/dark theme toggle.
- Spanish/English documentation switch.
- Free models supported through OpenRouter and others.
- No API keys are exposed in the frontend; all provider requests are made from the backend.
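The exact implementation lives in the repository's backend; as a rough sketch of the fan-out/aggregate idea (function names and request shapes below are illustrative assumptions, not the project's actual code), each model is queried in parallel and the aggregator then merges the drafts. Here each model is represented as a (provider_url, api_key, model_name) tuple and the providers are assumed to expose an OpenAI-compatible chat completions endpoint:

# Illustrative sketch of the fan-out / aggregate flow (not the project's actual code).
import requests
from concurrent.futures import ThreadPoolExecutor

def ask_model(provider_url, api_key, model, question):
    # Query one provider with an OpenAI-compatible chat completion request.
    resp = requests.post(
        provider_url,
        headers={"Authorization": f"Bearer {api_key}"},
        json={"model": model, "messages": [{"role": "user", "content": question}]},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

def moa_answer(question, models, aggregator):
    # Fan the question out to every configured model in parallel...
    with ThreadPoolExecutor() as pool:
        answers = list(pool.map(lambda m: ask_model(*m, question), models))
    # ...then ask the aggregator model to merge the drafts into one final answer.
    prompt = "Combine these answers into one final answer:\n\n" + "\n---\n".join(answers)
    return ask_model(*aggregator, prompt)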
Self Hosting
You can clone the project like this (includes the Dockerfile for containerization):
git clone https://huggingface.co/spaces/UntilDot/Flask
Requirements:
- Python 3.11+
- Pip
- Create a .env file and add your API keys.
Install dependencies:
pip install -r requirements.txt
Run locally:
python app.py
Default port is 7860 (to match Hugging Face standard).
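The real entry point is the repository's app.py; the snippet below is only a minimal sketch of what a Spaces-friendly Flask entry point looks like, and the route name and response shape are assumptions for illustration:

# Minimal illustrative Flask entry point bound to the Hugging Face Spaces port.
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/api/chat", methods=["POST"])  # route name is an assumption
def chat():
    question = request.json.get("message", "")
    # ...fan the question out to the configured models here...
    return jsonify({"answer": f"echo: {question}"})

if __name__ == "__main__":
    # 0.0.0.0:7860 matches the Hugging Face Spaces convention mentioned above.
    app.run(host="0.0.0.0", port=7860)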
Docker support:
The cloned repository includes a Dockerfile for easy containerization. Here is its content:
# Use a slim Python base
FROM python:3.11-slim
# Set working directory
WORKDIR /app
# Install dependencies
COPY requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt
# Copy source code
COPY . .
# Expose the default Hugging Face Spaces port
EXPOSE 7860
# Run the app
CMD ["python", "app.py"]
Alternatively, you can manually create your own Dockerfile with the above content if you prefer to customize it.
To build and run the Docker container after cloning the repository:
docker build -t moa-chat .
docker run -d -p 7860:7860 --env-file .env moa-chat
Docker will NOT automatically inject secrets unless you:
- Use a .env file with --env-file .env
- Manually pass -e VAR=VALUE flags to docker run
Environment Variables (Secrets)
To use this app, you must set your API keys in secrets or environment variables.
Follow this syntax:
OPENROUTER_API_KEY=your-openrouter-key
TOGETHER_API_KEY=your-together-key
GROK_API_KEY=your-grok-key
GROQ_API_KEY=your-groq-key
You can set these in:
- The Hugging Face Secrets section (recommended if on Spaces)
- A .env file (only for self-hosting)
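On the backend these keys are read from the environment. A minimal sketch of how that typically looks is below; the use of python-dotenv for the .env case is an assumption for illustration, not confirmed from the repository:

# Illustrative: load keys from the environment (or a local .env when self-hosting).
import os
from dotenv import load_dotenv  # assumption: python-dotenv is installed

load_dotenv()  # harmless no-op if there is no .env file, e.g. on Hugging Face Spaces

OPENROUTER_API_KEY = os.environ.get("OPENROUTER_API_KEY")
if not OPENROUTER_API_KEY:
    raise RuntimeError("OPENROUTER_API_KEY is not set; add it to your secrets or .env")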
How to Add More Models
All models and providers are declared inside:
llm/model_config.json
The structure looks like this:
{
  "providers": {
    "openrouter": {
      "url": "https://openrouter.ai/api/v1/chat/completions",
      "key_env": "OPENROUTER_API_KEY"
    }
  },
  "models": {
    "deepseek/deepseek-chat-v3-0324:free": "openrouter"
  }
}
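A rough sketch of how a config with this shape can be consumed is shown below; the helper is illustrative and the project's actual loader may differ:

# Illustrative: resolve a model name to its provider URL and API key.
import json
import os

def resolve_model(model_name, config_path="llm/model_config.json"):
    with open(config_path) as f:
        config = json.load(f)
    provider_name = config["models"][model_name]
    provider = config["providers"][provider_name]
    api_key = os.environ[provider["key_env"]]  # raises KeyError if the secret is missing
    return provider["url"], api_key

url, key = resolve_model("deepseek/deepseek-chat-v3-0324:free")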
To add a new model:
- Find the right provider (OpenRouter, Together, Grok, Groq, etc.).
- Add its endpoint URL under "providers" if it is not already listed.
- Add the model name under the "models" section, linking it to its provider.
Make sure your environment variables (secrets) are correctly configured.
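If you want a quick sanity check after editing the file, a small script along these lines (illustrative, not part of the project) can catch a missing provider or secret before you restart the app:

# Illustrative sanity check for llm/model_config.json.
import json
import os

with open("llm/model_config.json") as f:
    config = json.load(f)

for model, provider_name in config["models"].items():
    provider = config["providers"].get(provider_name)
    if provider is None:
        print(f"{model}: provider '{provider_name}' is not declared under 'providers'")
    elif not os.environ.get(provider["key_env"]):
        print(f"{model}: environment variable {provider['key_env']} is not set")
    else:
        print(f"{model}: OK")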
Licensing
This project is licensed under Apache 2.0. You are free to use, modify, and distribute it, even commercially.