---
title: IriusRiskTestChallenge
emoji: 🚀
colorFrom: green
colorTo: indigo
sdk: docker
pinned: false
license: apache-2.0
short_description: LLM backend for IriusRisk Tech challenge
---
# IriusRisk test challenge

This project implements a FastAPI service that uses LangChain and LangGraph to generate text with the SmolLM2-1.7B-Instruct model from HuggingFace.
## Configuration

### In HuggingFace Spaces

This project is designed to run in HuggingFace Spaces. To configure it:

- Create a new Space in HuggingFace with the Docker SDK
### Local development

For local development:

- Clone this repository
- Install the dependencies:

```bash
pip install -r requirements.txt
```
### Local execution

```bash
uvicorn app:app --reload --port 7860
```

The API will be available at http://localhost:7860.
## Endpoints

### GET /

Welcome endpoint that returns a greeting message.

### POST /generate

Endpoint to generate text using the language model.

### POST /summarize

Endpoint to summarize text using the language model.
Request parameters:

```json
{
  "query": "Your question here",
  "thread_id": "optional_thread_identifier"
}
```

Response:

```json
{
  "generated_text": "Generated text by the model",
  "thread_id": "thread identifier"
}
```
## Docker

To run the application in a Docker container:

```bash
# Build the image
docker build -t iriusrisk-test-challenge .

# Run the container
docker run -p 7860:7860 iriusrisk-test-challenge
```
## API documentation

The interactive API documentation is available at:

- Swagger UI: http://localhost:7860/docs
- ReDoc: http://localhost:7860/redoc