Space: TamisAI/inference-api-g1 (duplicated from TamisAI/inference-lamp-api)
Running on CPU Upgrade
1 contributor (alexfremont), 64 commits
Latest commit df3bf97, 10 days ago: Add enhanced Gradio UI with tabs, table view and model details formatting
| Directory | Last commit message | Updated |
|---|---|---|
| api | Refactor model loading to store metadata alongside pipelines in model_pipelines dict | 10 days ago |
| architecture | Refactor API architecture with modular design and database integration | 14 days ago |
| config | Add model management endpoints and database fetch functionality | 12 days ago |
| db | Refactor model loading to store metadata alongside pipelines in model_pipelines dict | 10 days ago |
| models | Refactor model loading to store metadata alongside pipelines in model_pipelines dict | 10 days ago |
| schemas | Refactor API architecture with modular design and database integration | 14 days ago |
| steps | Refactor API architecture with modular design and database integration | 14 days ago |
| utils | Refactor API architecture with modular design and database integration | 14 days ago |
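The api, db, and models directories share a commit message about keeping metadata next to the loaded pipelines in a `model_pipelines` dict. A minimal sketch of that pattern, assuming Hugging Face transformers pipelines; only the `model_pipelines` name comes from the commit message, and the field names and helper functions are hypothetical:

```python
# Sketch only: keep each loaded pipeline and its metadata together in one dict.
# The `model_pipelines` name is from the commit message; everything else is illustrative.
from transformers import pipeline

model_pipelines: dict[str, dict] = {}

def load_model(model_id: str, task: str) -> None:
    """Load a pipeline and store its metadata alongside it in model_pipelines."""
    pipe = pipeline(task=task, model=model_id)
    model_pipelines[model_id] = {
        "pipeline": pipe,   # the loaded transformers pipeline
        "task": task,       # e.g. "image-classification"
        "model_id": model_id,
    }

def predict(model_id: str, inputs):
    """Look up the stored pipeline by id and run inference."""
    entry = model_pipelines[model_id]
    return entry["pipeline"](inputs)
```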
| File | Size | Last commit message | Updated |
|---|---|---|---|
| .gitattributes | 1.52 kB | initial commit | 7 months ago |
| .gitignore | 347 Bytes | first commit for API | 7 months ago |
| Dockerfile | 832 Bytes | Merge Gradio UI into FastAPI app and standardize port to 7860 | 14 days ago |
| README.md | 276 Bytes | Update README.md | 22 days ago |
| docker-compose.yml | 205 Bytes | Merge Gradio UI into FastAPI app and standardize port to 7860 | 14 days ago |
| main.py | 8.24 kB | Add enhanced Gradio UI with tabs, table view and model details formatting | 10 days ago |
| requirements.txt | 228 Bytes | Replace gradio import syntax and remove unnecessary whitespace | 10 days ago |
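The Dockerfile, docker-compose.yml, and main.py commits describe merging the Gradio UI into the FastAPI app, serving both on port 7860, and giving the UI tabs with a table view. A minimal sketch of that layout, assuming Gradio's `mount_gradio_app` helper; the route, tab names, and table contents are illustrative and not taken from main.py:

```python
# Sketch only: one process serving a FastAPI API and a mounted Gradio UI on port 7860.
import gradio as gr
import uvicorn
from fastapi import FastAPI

app = FastAPI()

@app.get("/health")
def health():
    # Plain API route registered before the Gradio mount so it keeps precedence.
    return {"status": "ok"}

with gr.Blocks() as demo:
    # Tabbed layout echoing the "tabs, table view and model details" commit message.
    with gr.Tab("Models"):
        gr.Dataframe(value=[["example-model", "loaded"]], headers=["model", "status"])
    with gr.Tab("Details"):
        gr.Markdown("Select a model to see its details.")

# Mount the Gradio Blocks app onto the FastAPI app at the root path.
app = gr.mount_gradio_app(app, demo, path="/")

if __name__ == "__main__":
    # Hugging Face Spaces expects the server on port 7860.
    uvicorn.run(app, host="0.0.0.0", port=7860)
```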