---
license: apache-2.0
datasets:
- Open-Orca/OpenOrca
- teknium/openhermes
- cognitivecomputations/dolphin
- jondurbin/airoboros-3.1
- unalignment/toxic-dpo-v0.1
- unalignment/spicy-3.1
language:
- en
---
The flower of Ares.
Fine-tuned on mistralai/Mistral-7B-v0.1. My team and I reformatted many different datasets and included a small amount of private data to see how much we could improve Mistral.

I spoke with the model personally for about an hour. I believe we need to refine the format of our private dataset a bit more, but other than that, it turned out great. I will be submitting it to open LLM evaluations today.
- Uses the Mistral prompt template with chat-instruct; see the usage sketch below.
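
Below is a minimal usage sketch showing how a model following the Mistral prompt template (`<s>[INST] ... [/INST]`) can be loaded and queried with the `transformers` library. The repo ID `your-org/flower-of-ares` is a placeholder, not the actual repository name; substitute the real model path.

```python
# Minimal inference sketch (assumption: "your-org/flower-of-ares" is a placeholder repo ID).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/flower-of-ares"  # hypothetical; replace with the actual model repository
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# The chat template wraps the user turn in Mistral's [INST] ... [/INST] markers.
messages = [
    {"role": "user", "content": "Explain the difference between DPO and SFT in two sentences."}
]
inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```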