---
library_name: transformers
tags:
- unsloth
- trl
- sft
license: mit
datasets:
- luisroque/instruct-python-500k
language:
- en
base_model:
- meta-llama/Llama-3.1-8B-Instruct
pipeline_tag: text-generation
---

# Model Card for Llama-3.1-8B-Instruct Fine-Tuned for Python Code Generation

An instruction-tuned model for Python code generation, fine-tuned from meta-llama/Llama-3.1-8B-Instruct on the luisroque/instruct-python-500k dataset using supervised fine-tuning (TRL SFT with Unsloth).

## Uses

Intended for use as a code-editor assistant and for general Python code generation.

### Direct Use

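Before running the example below, the fine-tuned model and tokenizer need to be loaded. The snippet here is a minimal sketch: the repository ID is a placeholder, and the dtype and device settings are assumptions you may need to adjust for your hardware.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder: replace with this model's Hugging Face repository ID
model_id = "<this-repo-id>"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype = torch.bfloat16,  # assumption: adjust dtype or use quantization for your GPU
    device_map = "cuda",
)
```
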
```python
system_prompt = """
You are an expert Python programmer. Your task contains the following instructions.
You should answer the user's questions about Python.
Write the Python syntax based on the user's instructions.
The output format must contain only Python code after the ```python phrase.
You must use the user input variables in your code as placeholders.
"""

messages = [
    {"role": "system", "content": system_prompt},
    {"role": "user", "content": "Write a Dijkstra path finder algorithm in Python and execute it."},
]

inputs = tokenizer.apply_chat_template(
    messages,
    tokenize = True,
    add_generation_prompt = True,  # must be added for generation
    return_tensors = "pt",
).to("cuda")

from transformers import TextStreamer

# Stream the generated tokens as they are produced, skipping the prompt
text_streamer = TextStreamer(tokenizer, skip_prompt = True)

_ = model.generate(
    input_ids = inputs,
    streamer = text_streamer,
    max_new_tokens = 512,
    use_cache = True,
    do_sample = True,  # required for temperature / min_p to take effect
    temperature = 0.3,
    min_p = 0.9,
)
```
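
The system prompt asks the model to return only Python code inside a fenced block, so the raw response can be post-processed. The following is a minimal sketch (reusing the `model`, `tokenizer`, and `inputs` objects from above) that generates without streaming, decodes only the new tokens, and pulls out the first fenced code block:

```python
import re

# Generate the full response without streaming so it can be post-processed
output_ids = model.generate(
    input_ids = inputs,
    max_new_tokens = 512,
    use_cache = True,
    do_sample = True,
    temperature = 0.3,
    min_p = 0.9,
)

# Decode only the newly generated tokens (everything after the prompt)
response = tokenizer.decode(output_ids[0][inputs.shape[-1]:], skip_special_tokens = True)

# Extract the first fenced Python block if the model followed the requested format
match = re.search(r"```python\s*(.*?)```", response, re.DOTALL)
generated_code = match.group(1).strip() if match else response.strip()
print(generated_code)
```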