---
title: Flask
emoji: 🤖
colorFrom: purple
colorTo: purple
sdk: docker
pinned: false
---

# MoA Chat

MoA Chat is a multi-agent AI chat platform where several LLMs respond in parallel and an aggregator model combines their outputs into a single response (a sketch of this pattern appears at the end of this README).

## Features

- Python 3 backend using Flask
- Frontend with HTML, JavaScript, and TailwindCSS (optional)
- Simultaneous queries to multiple AI models
- Aggregator LLM refines the combined output
- Light and dark themes
- Documentation language switch between English and Spanish
- Secret API key management
- Hugging Face Spaces compatibility
- Docker support
- Free models through OpenRouter and other providers

## Requirements

- Python 3.11+
- pip
- API keys for the model providers you want to use

## Installation

1. Clone the repository:

   ```bash
   git clone https://huggingface.co/spaces/UntilDot/Flask
   cd Flask
   ```

2. Create a `.env` file with your provider keys:

   ```env
   OPENROUTER_API_KEY=your-openrouter-key
   TOGETHER_API_KEY=your-together-key
   GROK_API_KEY=your-grok-key
   GROQ_API_KEY=your-groq-key
   ```

3. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

4. Run the application:

   ```bash
   python app.py
   ```

The server will be available at `http://localhost:7860`.

## Adding New Models

Edit `llm/model_config.json` to register new models and providers. Example:

```json
{
  "providers": {
    "openrouter": {
      "url": "https://openrouter.ai/api/v1/chat/completions",
      "key_env": "OPENROUTER_API_KEY"
    }
  },
  "models": {
    "deepseek/deepseek-chat-v3-0324:free": "openrouter"
  }
}
```

Each model name maps to a provider, and each provider defines its endpoint URL and the environment variable that holds its API key. A sketch of how such an entry can be resolved at runtime appears at the end of this README.

## Docker Support

To build and run with Docker, create the following `Dockerfile`:

```Dockerfile
FROM python:3.11-slim

WORKDIR /app

COPY requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

EXPOSE 7860

CMD ["python", "app.py"]
```

Then build and start the container:

```bash
docker build -t moa-chat .
docker run -d -p 7860:7860 --env-file .env moa-chat
```

API keys must be supplied either as environment variables or through an `.env` file.

## License

Licensed under the Apache License 2.0.
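
## Example: Parallel Queries and Aggregation (Illustrative Sketch)

The snippet below is a minimal sketch of the pattern described in the introduction, not the repository's actual implementation (which lives in `app.py` and the `llm/` package). It assumes an OpenAI-compatible chat completions endpoint such as OpenRouter's; the helper names `query_model` and `moa_answer`, as well as the second proposer model ID, are hypothetical.

```python
"""Minimal sketch of the MoA pattern: query several models in parallel,
then ask an aggregator model to merge their answers into one reply.

Helper names and model choices are illustrative; the app's real logic
lives in app.py and the llm/ package and may differ.
"""
import os
from concurrent.futures import ThreadPoolExecutor

import requests

# OpenRouter's OpenAI-compatible chat completions endpoint
# (same URL as in the model_config.json example above).
OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"
API_KEY = os.environ["OPENROUTER_API_KEY"]

# Placeholder model IDs; in the app these come from llm/model_config.json.
PROPOSERS = [
    "deepseek/deepseek-chat-v3-0324:free",
    "some-provider/some-other-model:free",  # replace with a registered model
]
AGGREGATOR = "deepseek/deepseek-chat-v3-0324:free"


def query_model(model: str, prompt: str) -> str:
    """Send one chat completion request and return the model's text reply."""
    response = requests.post(
        OPENROUTER_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"model": model, "messages": [{"role": "user", "content": prompt}]},
        timeout=60,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]


def moa_answer(prompt: str) -> str:
    """Query every proposer in parallel, then let the aggregator combine the drafts."""
    with ThreadPoolExecutor(max_workers=len(PROPOSERS)) as pool:
        drafts = list(pool.map(lambda model: query_model(model, prompt), PROPOSERS))

    aggregation_prompt = (
        "Combine the following candidate answers into a single accurate reply.\n\n"
        + "\n\n---\n\n".join(drafts)
        + f"\n\nOriginal question: {prompt}"
    )
    return query_model(AGGREGATOR, aggregation_prompt)


if __name__ == "__main__":
    print(moa_answer("What is a mixture-of-agents chat platform?"))
```

Running the proposer requests in a thread pool keeps wall-clock latency close to that of the slowest single model rather than the sum of all of them.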
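
## Example: Resolving a Model from the Config (Illustrative Sketch)

The sketch below shows how an entry in `llm/model_config.json` could be mapped to an endpoint URL and API key at runtime, following the schema shown in "Adding New Models". The `resolve_model` helper is hypothetical and only illustrates the lookup; the project's own loader may be organized differently.

```python
"""Sketch of resolving a registered model from llm/model_config.json.
The function name and error handling are illustrative, not the project's API.
"""
import json
import os


def resolve_model(model_name: str, config_path: str = "llm/model_config.json"):
    """Return (endpoint URL, API key) for a model registered in the config."""
    with open(config_path, encoding="utf-8") as fh:
        config = json.load(fh)

    provider_name = config["models"][model_name]    # e.g. "openrouter"
    provider = config["providers"][provider_name]
    api_key = os.environ.get(provider["key_env"])   # e.g. OPENROUTER_API_KEY
    if not api_key:
        raise RuntimeError(f"Set {provider['key_env']} in your environment or .env file")
    return provider["url"], api_key


# Example usage with the entry shown above:
# url, key = resolve_model("deepseek/deepseek-chat-v3-0324:free")
```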