---
title: IriusRiskTestChallenge
emoji: 🚀
colorFrom: green
colorTo: indigo
sdk: docker
pinned: false
license: apache-2.0
short_description: LLM backend for IriusRisk Tech challenge
---

IriusRisk test challenge

This project implements an API with FastAPI that uses LangChain and LangGraph to generate text with the SmolLM2-1.7B-Instruct model from HuggingFace.

Configuration

In HuggingFace Spaces

This project is designed to run in HuggingFace Spaces. To configure it:

  1. Create a new Space in HuggingFace with the Docker SDK

Local development

For local development:

  1. Clone this repository
  2. Install the dependencies:
    pip install -r requirements.txt
    
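The contents of requirements.txt are not reproduced in this README; based on the stack described above (FastAPI, LangChain, LangGraph, a HuggingFace model), it will look roughly like the following. The exact package list and versions are assumptions, not the project's actual file:

```text
fastapi
uvicorn[standard]
langchain
langgraph
transformers
torch
```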

Local execution

uvicorn app:app --reload

The API will be available at http://localhost:7860.

Endpoints

GET /

Welcome endpoint that returns a greeting message.

POST /generate

Endpoint to generate text using the language model.

POST /summarize

Endpoint to summarize text using the language model.

Request parameters:

{
  "query": "Your question here",
  "thread_id": "optional_thread_identifier"
}

Response:

{
  "generated_text": "Generated text by the model",
  "thread_id": "thread identifier"
}

Docker

To run the application in a Docker container:

# Build the image
docker build -t iriusrisk-test-challenge .

# Run the container
docker run -p 7860:7860 iriusrisk-test-challenge
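The Dockerfile itself is not shown in this README; for a FastAPI app on HuggingFace Spaces, which serves traffic on port 7860, a minimal version typically looks like the sketch below. The base image and file layout are assumptions, not the project's actual Dockerfile:

```dockerfile
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first so Docker can cache this layer
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# HuggingFace Spaces routes traffic to port 7860
EXPOSE 7860
CMD ["uvicorn", "app:app", "--host", "0.0.0.0", "--port", "7860"]
```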

API documentation

The interactive API documentation is available at:

  • Swagger UI: http://localhost:7860/docs
  • ReDoc: http://localhost:7860/redoc