A copy of only the tokenizer from https://huggingface.co/meta-llama/Llama-2-7b-hf.
Code example:

```python
from transformers import LlamaTokenizerFast

tokenizer = LlamaTokenizerFast.from_pretrained("dinhanhx/llama-tokenizer-hf")
# Vietnamese: "Because platinum is very precious, it will be used for bone implants"
text = "Do bạch kim rất quý nên sẽ dùng để lắp vô xương"
print(tokenizer.convert_ids_to_tokens(tokenizer.encode(text)))
```