---
title: Gradio Chatbot
emoji: 🚀
colorFrom: yellow
colorTo: purple
sdk: gradio
sdk_version: 5.0.1
app_file: app.py
pinned: true
short_description: Chatbot
---

# Gradio Chatbot: Hugging Face SLMs

A modular Gradio application for interacting with various small language models (SLMs) through the Hugging Face Inference API.

## Project Structure

```
slm-poc/
├── main.py                    # Main application entry point
├── modules/
│   ├── __init__.py            # Package initialization
│   ├── config.py              # Configuration settings and constants
│   ├── document_processor.py  # Document handling and processing
│   └── model_handler.py       # Model interaction and response generation
├── Dockerfile                 # Docker configuration
├── requirements.txt           # Python dependencies
└── README.md                  # Project documentation
```

## Features

  • Interactive chat interface with multiple language model options
  • Document processing (PDF, DOCX, TXT) for question answering
  • Adjustable model parameters (temperature, top_p, max_length)
  • Streaming responses for better user experience
  • Docker support for easy deployment

## Setup and Running

### Local Development

1. Clone the repository.
2. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

3. Create a `.env` file with your Hugging Face API token (loaded at startup; see the sketch after these steps):

   ```
   HF_TOKEN=hf_your_token_here
   ```

4. Run the application:

   ```bash
   python main.py
   ```
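
A minimal sketch of how the token can be read at startup, assuming `python-dotenv` is listed in `requirements.txt`:

```python
# Minimal sketch of reading the Hugging Face token at startup.
# Assumes python-dotenv is installed.
import os

from dotenv import load_dotenv
from huggingface_hub import InferenceClient

load_dotenv()  # loads HF_TOKEN from a local .env file, if present
hf_token = os.getenv("HF_TOKEN")  # may be None; the UI can still ask for a token

client = InferenceClient(token=hf_token)
```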
    

### Docker Deployment

1. Build the Docker image:

   ```bash
   docker build -t slm-poc .
   ```

2. Run the container:

   ```bash
   docker run -p 7860:7860 -e HF_TOKEN=hf_your_token_here slm-poc
   ```
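
Inside a container the app has to listen on all interfaces for the published port to work; a minimal sketch of the launch call (the actual `main.py` may configure this differently):

```python
import gradio as gr

# Placeholder interface; `demo` stands in for the app built in main.py.
demo = gr.ChatInterface(lambda message, history: message, type="messages")

# Bind to 0.0.0.0 so the app is reachable through `docker run -p 7860:7860`.
demo.launch(server_name="0.0.0.0", server_port=7860)
```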
    

## Usage

1. Access the web interface at http://localhost:7860.
2. Enter your Hugging Face API token if it was not provided via environment variables.
3. Select your preferred model and adjust the parameters.
4. Start chatting with the model.
5. Optionally, upload documents for document-based Q&A (a sketch of the text-extraction step follows below).
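
As a sketch of the document-upload path, the helper below extracts plain text from PDF, DOCX, and TXT files; the function name and the `pypdf`/`python-docx` choice are assumptions, not necessarily what `document_processor.py` uses.

```python
# Sketch of text extraction for uploaded documents (PDF, DOCX, TXT).
# extract_text is a hypothetical helper name; pypdf and python-docx are assumed.
from pathlib import Path

from docx import Document
from pypdf import PdfReader


def extract_text(path: str) -> str:
    suffix = Path(path).suffix.lower()
    if suffix == ".pdf":
        reader = PdfReader(path)
        return "\n".join(page.extract_text() or "" for page in reader.pages)
    if suffix == ".docx":
        doc = Document(path)
        return "\n".join(p.text for p in doc.paragraphs)
    if suffix == ".txt":
        return Path(path).read_text(encoding="utf-8", errors="ignore")
    raise ValueError(f"Unsupported file type: {suffix}")
```

The extracted text can then be added to the prompt as context for document-based question answering.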

## Supported Models

Text-to-text (T2T) generation models served by Hugging Face via the Inference API.
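
As a hypothetical illustration, `config.py` might expose the selectable models as a mapping from display names to Hub repo ids; the ids below are examples of small instruct models on the Hub, not necessarily the ones this app ships with.

```python
# Hypothetical example of a model registry in config.py.
# Display names and repo ids are illustrative only.
AVAILABLE_MODELS = {
    "SmolLM2 1.7B Instruct": "HuggingFaceTB/SmolLM2-1.7B-Instruct",
    "Qwen2.5 1.5B Instruct": "Qwen/Qwen2.5-1.5B-Instruct",
    "Phi-3 Mini 4k Instruct": "microsoft/Phi-3-mini-4k-instruct",
}

DEFAULT_PARAMS = {"temperature": 0.7, "top_p": 0.95, "max_tokens": 512}
```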

## License

This project is licensed under the MIT License - see the LICENSE file for details.