---
title: 🔬🧠ScienceBrain.AI-8model-GPT-4o
emoji: 🔬
colorFrom: red
colorTo: green
sdk: streamlit
sdk_version: 1.30.0
app_file: app.py
pinned: true
license: mit
---
This experimental multi-agent mixture-of-experts system combines a variety of models and techniques to create different combinatorial AI solutions (a fan-out sketch follows the model list below).
Models Used:
1. Mistral-7B-Instruct
2. Llama2-7B
3. Mixtral-8x7B-Instruct
4. Google Gemma-7B
5. OpenAI Whisper Small En
6. OpenAI GPT-4o, Whisper-1
7. arXiv Embeddings
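As a rough illustration of how several of these experts could be queried together, the sketch below fans a single prompt out to a few of them. The checkpoint IDs, the `HF_TOKEN` / `OPENAI_API_KEY` environment variables, and the `ask_all` helper are illustrative assumptions, not code taken from app.py.

```python
# Minimal fan-out sketch, not the app's actual routing: one prompt is sent to
# several of the experts listed above and the replies are collected side by side.
import os
from huggingface_hub import InferenceClient
from openai import OpenAI

HF_MODELS = [
    "mistralai/Mistral-7B-Instruct-v0.2",    # assumed ID for Mistral-7B-Instruct
    "mistralai/Mixtral-8x7B-Instruct-v0.1",  # assumed ID for Mixtral-8x7B-Instruct
    "google/gemma-7b-it",                    # assumed ID for Gemma-7B
]

def ask_all(prompt: str) -> dict:
    """Send one prompt to each expert and collect the replies by model name."""
    answers = {}
    for model_id in HF_MODELS:
        client = InferenceClient(model=model_id, token=os.getenv("HF_TOKEN"))
        answers[model_id] = client.text_generation(prompt, max_new_tokens=256)
    # GPT-4o via the OpenAI client (reads OPENAI_API_KEY from the environment).
    chat = OpenAI().chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": prompt}],
    )
    answers["gpt-4o"] = chat.choices[0].message.content
    return answers
```

Calling `ask_all("Explain mixture-of-experts in two sentences.")` would return one reply per expert, which the UI could then display side by side for comparison.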
The techniques below are AI capabilities rather than ML models:
1. Speech synthesis using browser technology (a browser-TTS sketch follows this list)
2. Memory for semantic facts, plus episodic memories of emotions and event time series
3. Web integration using the q= query-string convention for search links, allowing comparison of the tech giants' AI implementations (a search-link sketch follows this list):
4. Bing, then Bing Copilot with a second click
5. Google, which now performs an AI-assisted search
6. Twitter, the new home for technology discoveries, AI output, and Grok
7. Wikipedia for fact checking
8. YouTube
9. File and metadata integration combining text, audio, image, and video
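A minimal sketch of item 1, browser-based speech synthesis via the standard Web Speech API, assuming a Streamlit front end; the `speak` helper is hypothetical and may not match how app.py wires this up.

```python
# Browser-side text-to-speech: the visitor's browser does the speaking,
# so no server-side TTS model is needed.
import json
import streamlit.components.v1 as components

def speak(text: str) -> None:
    """Inject a small JS snippet so the visitor's browser reads `text` aloud."""
    components.html(
        f"""
        <script>
          const utterance = new SpeechSynthesisUtterance({json.dumps(text)});
          window.speechSynthesis.speak(utterance);
        </script>
        """,
        height=0,  # no visible widget; the component only runs the script
    )

speak("Hello from ScienceBrain dot AI.")
```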
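A minimal sketch of item 3, building q=-style search links so one query can be compared across providers; the provider list and URL templates are assumptions based on the public search endpoints, not a copy of app.py.

```python
# Build one outbound search link per provider for the same query,
# so their AI-augmented answers can be compared side by side.
from urllib.parse import quote_plus
import streamlit as st

SEARCH_SITES = {
    "Bing":      "https://www.bing.com/search?q=",
    "Google":    "https://www.google.com/search?q=",
    "Twitter/X": "https://twitter.com/search?q=",
    "Wikipedia": "https://en.wikipedia.org/w/index.php?search=",   # uses search=
    "YouTube":   "https://www.youtube.com/results?search_query=",  # uses search_query=
}

query = st.text_input("Compare results for:", "mixture of experts")
if query:
    for name, base in SEARCH_SITES.items():
        st.markdown(f"- [{name}]({base}{quote_plus(query)})")
```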
This app also merges common theories from cognitive AI with classic Python AI libraries (e.g. NLTK, scikit-learn).
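As a rough illustration of that NLTK + scikit-learn pairing (the sample documents and the TF-IDF retrieval step are assumptions, not the app's pipeline):

```python
# Tokenize with NLTK, then rank documents against a query with scikit-learn TF-IDF.
import nltk
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

nltk.download("punkt", quiet=True)      # tokenizer data for word_tokenize
nltk.download("punkt_tab", quiet=True)  # required by newer NLTK releases

docs = [
    "Whisper transcribes speech to text.",
    "Mixture-of-experts models route tokens to specialist experts.",
    "TF-IDF weights terms by how distinctive they are across documents.",
]

# NLTK supplies the tokenizer; scikit-learn builds the TF-IDF matrix on top of it.
vectorizer = TfidfVectorizer(tokenizer=nltk.word_tokenize, token_pattern=None)
doc_matrix = vectorizer.fit_transform(docs)

query_vec = vectorizer.transform(["how do experts get routed?"])
scores = cosine_similarity(query_vec, doc_matrix)[0]
print(max(zip(scores, docs)))  # best-matching document and its similarity score
```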
The intent is to demonstrate SOTA AI/ML and Function-Input-Output combinations for interoperability and knowledge management.
This Space also serves as an experimental test bed that mixes new technologies with old ones for comparison and integration.
--Aaron