Space: TamisAI/inference-api-g1 (duplicated from TamisAI/inference-lamp-api) · Status: Sleeping · Likes: 0
Path: inference-api-g1 / api (branch: main)
1 contributor · History: 18 commits
Latest commit: alexfremont, "Improve model unloading with explicit GPU memory cleanup and CUDA cache clearing" (8a8fe7c, 6 days ago)
File             Scan  Size       Age          Last commit
__init__.py      Safe  0 Bytes    10 days ago  Refactor API architecture with modular design and database integration
dependencies.py  Safe  2 kB       8 days ago   Clean up imports and remove unused code across API modules
management.py    Safe  6.94 kB    6 days ago   Improve model unloading with explicit GPU memory cleanup and CUDA cache clearing (see first sketch below)
prediction.py    Safe  2.58 kB    8 days ago   Refactor API auth and add management endpoints for model loading/updating
router.py        Safe  705 Bytes  8 days ago   Move API key middleware to main.py and add startup/shutdown lifecycle management (see second sketch below)