<!DOCTYPE html>
<html lang="en" class="bg0 text-fg0 dark">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>Documentation - MoA Chat</title>
  <link href="https://cdn.jsdelivr.net/npm/[email protected]/dist/tailwind.min.css" rel="stylesheet">
  <link rel="stylesheet" href="/static/style.css" />
</head>
<body class="bg0 text-fg0 transition-colors duration-300 flex flex-col items-center min-h-screen p-6">

  <header class="w-full flex justify-between items-center max-w-4xl mb-6">
    <h1 class="text-3xl font-bold">MoA Chat Documentation</h1>
    <button id="themeToggle" class="bg-blue px-3 py-1 rounded text-fg0 hover:bg-purple transition">πŸŒ™</button>
  </header>

  <div class="max-w-4xl w-full bg1 rounded p-8 shadow-md">

    <section class="mb-10">
      <h2 class="text-2xl font-semibold mb-4 text-blue">What is MoA Chat?</h2>
      <p class="mb-4 text-lg">
        <strong>MoA (Mixture of Agents)</strong> is a technique in which multiple AI agents (different LLMs) collaborate to produce a higher-quality response than any single model could on its own.
      </p>
      <p class="mb-4">
        MoA Chat implements a simple version of this architecture by:
      </p>
      <ul class="list-disc ml-6 text-lg">
        <li>Querying several different models (LLM-A, LLM-B, LLM-C) at once.</li>
        <li>Combining their answers using another model (LLM-D, the aggregator).</li>
        <li>Returning a single, structured reply to the user.</li>
      </ul>
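      <p class="mb-4 mt-4">As a rough sketch, the three steps above look like this in Python (illustrative only; <code>call_model</code> and the model names are placeholders, not the actual code in <code>llm/agents.py</code>):</p>

```python
# Hypothetical sketch of the MoA pipeline: query several models in
# parallel, then let an aggregator model fuse their answers.
from concurrent.futures import ThreadPoolExecutor

def call_model(model: str, prompt: str) -> str:
    # Placeholder: a real app would call OpenRouter, Together, etc. here.
    return f"[{model}] answer to: {prompt}"

def moa_reply(prompt: str, agents: list[str], aggregator: str) -> str:
    # 1. Query LLM-A, LLM-B, LLM-C at once.
    with ThreadPoolExecutor(max_workers=len(agents)) as pool:
        drafts = list(pool.map(lambda m: call_model(m, prompt), agents))
    # 2. Ask the aggregator (LLM-D) to fuse the drafts into one reply.
    fused_prompt = "Combine these answers into one reply:\n" + "\n".join(drafts)
    # 3. Return the aggregator's single, structured answer.
    return call_model(aggregator, fused_prompt)
```

      <p class="mb-4 text-sm opacity-70">The thread pool only parallelizes the independent first-round calls; the aggregation step necessarily waits for all drafts.</p>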
    </section>

    <section class="mb-10">
      <h2 class="text-2xl font-semibold mb-4 text-blue">How MoA Works (Visual)</h2>
      <p class="mb-4 text-center">
        <img src="https://raw.githubusercontent.com/togethercomputer/MoA/main/assets/moa_pipeline.png" alt="MoA Architecture" class="mx-auto rounded shadow-md">
      </p>
      <p class="text-center text-sm opacity-70">Source: Together MoA Architecture Concept</p>
    </section>

    <section class="mb-10">
      <h2 class="text-2xl font-semibold mb-4 text-blue">How to Use</h2>
      <ol class="list-decimal ml-6 text-lg">
        <li>Click βš™οΈ to open the configuration panel.</li>
        <li>Select your preferred models for LLM-A, LLM-B, LLM-C, and Aggregator (LLM-D).</li>
        <li>Type your message in the input box.</li>
        <li>Press <strong>Send</strong>.</li>
        <li>Watch multiple models collaborate for the best response!</li>
      </ol>
    </section>

    <section class="mb-10">
      <h2 class="text-2xl font-semibold mb-4 text-blue">Features</h2>
      <ul class="list-disc ml-6 text-lg">
        <li>Parallel querying of multiple free or premium models (via OpenRouter, Together, Grok, etc.).</li>
        <li>Structured prompts for each model to encourage quality responses.</li>
        <li>An aggregator model fuses the individual outputs into a single reply.</li>
        <li>Dynamic Light/Dark mode (Gruvbox Material Theme).</li>
        <li>Minimal, fast, and secure frontend with no API keys exposed.</li>
      </ul>
    </section>

    <section class="mb-10">
      <h2 class="text-2xl font-semibold mb-4 text-blue">Deployment</h2>
      <p class="mb-4 text-lg">
        MoA Chat is optimized to run on <strong>Hugging Face Spaces</strong> or any platform that supports Python 3.11+, Flask, and Docker-based containers.
      </p>
      <p class="text-lg">
        All API keys must be set through Hugging Face's Secrets system; they are never exposed to the frontend.
      </p>
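      <p class="text-lg mt-4">For example, the backend can read the keys from environment variables, which is how Hugging Face Spaces exposes Secrets (a minimal sketch; the variable name below is an assumption, not necessarily what <code>app.py</code> uses):</p>

```python
import os

def require_key(name: str) -> str:
    """Fetch an API key from the environment, failing loudly if absent."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Missing secret: set {name} in your Space settings")
    return value

# Example (hypothetical variable name):
# OPENROUTER_API_KEY = require_key("OPENROUTER_API_KEY")
```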
    </section>

    <section class="mb-10">
      <h2 class="text-2xl font-semibold mb-4 text-blue">File Structure</h2>
      <ul class="list-disc ml-6 text-lg">
        <li><code>app.py</code> β€” Flask backend server.</li>
        <li><code>llm/agents.py</code> β€” Query and aggregation logic for MoA system.</li>
        <li><code>llm/model_config.json</code> β€” Defines the available models and providers.</li>
        <li><code>templates/</code> β€” Contains <code>index.html</code> and <code>docs.html</code>.</li>
        <li><code>static/</code> β€” Contains <code>style.css</code> and <code>script.js</code>.</li>
      </ul>
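      <p class="text-lg mt-4">A minimal <code>model_config.json</code> might look like the following (the schema and field names here are illustrative assumptions, not the file's actual contents):</p>

```json
{
  "aggregator": "LLM-D",
  "models": {
    "LLM-A": { "provider": "openrouter", "name": "example/model-a" },
    "LLM-B": { "provider": "together", "name": "example/model-b" },
    "LLM-C": { "provider": "openrouter", "name": "example/model-c" }
  }
}
```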
    </section>

    <section class="mb-10">
      <h2 class="text-2xl font-semibold mb-4 text-blue">Credits</h2>
      <p class="text-lg">Made with ❀️ in PanamÑ by Until Dot. Inspired by Together's MoA architecture.</p>
    </section>

  </div>

  <div class="text-center mt-8">
    <a href="/" class="px-6 py-2 bg-blue text-fg0 rounded hover:bg-purple transition text-lg font-semibold">← Back to Chat</a>
  </div>

  <script>
    // Toggle between light and dark mode and persist the choice.
    const themeToggle = document.getElementById("themeToggle");
    const html = document.documentElement;

    themeToggle.addEventListener("click", () => {
      const isDark = html.classList.toggle("dark");
      localStorage.setItem("theme", isDark ? "dark" : "light");
      themeToggle.textContent = isDark ? "πŸŒ™" : "β˜€οΈ";
    });

    // Restore the saved theme (if any) on page load.
    window.addEventListener("load", () => {
      const savedTheme = localStorage.getItem("theme");
      if (savedTheme === "dark") {
        html.classList.add("dark");
        themeToggle.textContent = "πŸŒ™";
      } else {
        html.classList.remove("dark");
        themeToggle.textContent = "β˜€οΈ";
      }
    });
  </script>

</body>
</html>