A newer version of this model is available: FlameF0X/Muffin-2.9b-1C25

Muffin 2.9a-0C17

Overview

Muffin 2.9a-0C17 is an experimental small language model trained on a fully synthetic dataset. This model represents an exploration of alternative training methodologies and synthetic data usage for dialogue generation.

Model Details

  • Architecture: Based on DistilGPT2
  • Training Method: Experimental approach (details forthcoming in future releases)
  • Dataset: Fully synthetic
  • Use Case: Casual dialogue systems

Usage

Load the model with a pipeline:

# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="FlameF0X/Muffin-2.9a-0C17")
result = pipe("Hello, how are you?", max_new_tokens=50)
print(result[0]["generated_text"])

Or load the model directly:

# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("FlameF0X/Muffin-2.9a-0C17")
model = AutoModelForCausalLM.from_pretrained("FlameF0X/Muffin-2.9a-0C17")

# Tokenize a prompt, generate a continuation, and decode it
inputs = tokenizer("Hello, how are you?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

HF Space Available

I made a Space for a better interactive experience.

Community Feedback

If you encounter issues, please report them in the Community tab. Your feedback is valuable for improving future iterations of the model.

Acknowledgments

Thanks to the community for testing and providing feedback on this experimental model.

Downloads last month: 30
Model size: 81.9M params (F32 tensors, Safetensors format)

Model tree for FlameF0X/Muffin-2.9a-0C17

Finetunes: 3 models
Quantizations: 1 model
