---
title: Code Generation with CodeT5
emoji: 😻
colorFrom: yellow
colorTo: green
sdk: gradio
sdk_version: 5.27.0
app_file: app.py
pinned: false
license: mit
hf_oauth: true
hf_oauth_scopes:
  - inference-api
short_description: Leverage CodeT5-base for code generation tasks.
model_info:
  model_name: Salesforce/codet5-base
  model_type: Encoder-Decoder Transformer
  architecture: T5-based
  pretraining_tasks:
    - Denoising
    - Bimodal Dual Generation
  training_data:
    - CodeSearchNet
    - CodeXGLUE
  fine_tuning_tasks:
    - Code Summarization
    - Code Generation
    - Code Translation
  performance_benchmarks:
    - CodeXGLUE
  paper: >-
    CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models for Code
    Understanding and Generation
  publication_date: '2021-09-02'
  arxiv_url: https://arxiv.org/abs/2109.00859
  github_url: https://github.com/salesforce/CodeT5
  huggingface_url: https://huggingface.co/Salesforce/codet5-base
---

# 🚀 Code Generation with CodeT5

Welcome to the Code Generation with CodeT5 project! This repository demonstrates how to use the `Salesforce/codet5-base` model to generate Python code snippets from textual prompts. The app is built with Gradio for the interactive web interface and is deployed on Hugging Face Spaces.
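Under the hood, the app loads the checkpoint with the `transformers` library and decodes a completion for each prompt. The sketch below illustrates that flow on its own; the prompt and generation settings are illustrative placeholders, not the exact values used in this Space's `app.py`.

```python
# Minimal sketch: load Salesforce/codet5-base and generate a snippet.
# Prompt and generation parameters are placeholders, not this app's exact settings.
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("Salesforce/codet5-base")
model = T5ForConditionalGeneration.from_pretrained("Salesforce/codet5-base")

prompt = "def fibonacci(n):"  # example prompt
inputs = tokenizer(prompt, return_tensors="pt")

# Beam search keeps the output deterministic; adjust max_length/num_beams as needed.
output_ids = model.generate(**inputs, max_length=128, num_beams=4, early_stopping=True)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```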

## 📚 Repository Contents

- **Model Configuration:**
  Stored in `config.json`, this file defines the architecture and settings of the CodeT5 model.

- **Tokenizer Special Tokens:**
  Located in `special_tokens_map.json`, this file maps the special tokens used during tokenization.

- **Training Hyperparameters:**
  Found in `training_args.json`, this file contains parameters such as the learning rate, batch size, and number of epochs used during training.

- **Inference Code:**
  The `app.py` script loads the model and provides the Gradio interface for code generation (a minimal sketch of this pattern follows the list).

- **Dependencies:**
  Listed in `requirements.txt`, these are the packages required to run the app.

- **Documentation:**
  This `README.md` provides an overview and a guide for setting up and using the repository.
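The actual interface is defined in `app.py`; the sketch below only illustrates the general pattern. The function name, UI labels, and generation settings are assumptions rather than the repository's exact code.

```python
# Illustrative Gradio wrapper around CodeT5; not a copy of this repo's app.py.
import gradio as gr
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("Salesforce/codet5-base")
model = T5ForConditionalGeneration.from_pretrained("Salesforce/codet5-base")

def generate_code(prompt: str) -> str:
    """Generate a code snippet from a natural-language or partial-code prompt."""
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_length=128, num_beams=4, early_stopping=True)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

demo = gr.Interface(
    fn=generate_code,
    inputs=gr.Textbox(lines=4, label="Prompt"),
    outputs=gr.Code(language="python", label="Generated code"),
    title="Code Generation with CodeT5",
)

if __name__ == "__main__":
    demo.launch()
```

Using `gr.Code` for the output keeps the generated snippet syntax-highlighted; a plain `gr.Textbox` works just as well.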

## 🔧 Setup & Usage

### 1. Clone the Repository

Clone the repository to your local machine:

```bash
git clone https://github.com/your-username/codegen-model-repo.git
cd codegen-model-repo
```

### 2. Install Dependencies

Install the required packages using pip:

```bash
pip install -r requirements.txt
```

### 3. Run the Gradio App

Launch the Gradio app to start generating code:

```bash
python app.py
```

Gradio prints a local URL in the terminal (http://127.0.0.1:7860 by default); open it in your browser to enter prompts and receive generated code snippets.
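Once the app is running, it can also be queried programmatically via the `gradio_client` package; this short sketch assumes Gradio's default local URL and endpoint name:

```python
# Query the locally running Gradio app; the URL and api_name assume Gradio defaults.
from gradio_client import Client

client = Client("http://127.0.0.1:7860")
result = client.predict("def fibonacci(n):", api_name="/predict")
print(result)
```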

🌐 Deploying on Hugging Face Spaces

To deploy your Gradio app on Hugging Face Spaces:

1. **Create a New Space:**
   On Hugging Face, create a new Space and select **Gradio** as the SDK.

2. **Push Your Code:**
   - Initialize a Git repository in your project directory.
   - Commit your code and push it to the new Space's repository.

For a detailed walkthrough on deploying Gradio apps to Hugging Face Spaces, refer to the Hugging Face documentation on Gradio Spaces.
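As an alternative to the git workflow, the `huggingface_hub` Python library can create the Space and upload the project files. The sketch below assumes you are authenticated (for example via `huggingface-cli login`), and the repository ID is a placeholder:

```python
# Programmatic alternative to the git workflow above; repo_id is a placeholder.
from huggingface_hub import HfApi

api = HfApi()  # picks up the token from `huggingface-cli login` or HF_TOKEN

# Create the Space with the Gradio SDK (no-op if it already exists).
api.create_repo(
    repo_id="your-username/code-generation-with-codet5",
    repo_type="space",
    space_sdk="gradio",
    exist_ok=True,
)

# Upload the local project directory (app.py, requirements.txt, README.md, ...).
api.upload_folder(
    folder_path=".",
    repo_id="your-username/code-generation-with-codet5",
    repo_type="space",
)
```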

## 📄 License

This project is licensed under the MIT License.