---
base_model:
- Qwen/QwQ-32B
- Yobenboben/Qwen2.5-32B-Juicy_Snowballs
- spow12/ChatWaifu_32B_reasoning
- trashpanda-org/QwQ-32B-Snowdrop-v0
library_name: transformers
tags:
- mergekit
- merge
---
# Snegs

Tried to fix thinking in the Snowballs merge, but alas. Context recall is somewhat better, though.
## Settings

Same as here: https://huggingface.co/Yobenboben/Qwen2.5-32B-Juicy_Snowballs
## Quants

https://huggingface.co/mradermacher/Qwen2.5-32B-Snegs-GGUF
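The GGUF quants can be run locally with llama.cpp. A minimal sketch, assuming llama.cpp is built and a Q4_K_M quant has been downloaded (the exact filename below is illustrative; check the quant repo for the actual file names):

```shell
# Download one quant file from the repo (filename is an assumption,
# verify it against the mradermacher/Qwen2.5-32B-Snegs-GGUF file list).
huggingface-cli download mradermacher/Qwen2.5-32B-Snegs-GGUF \
  Qwen2.5-32B-Snegs.Q4_K_M.gguf --local-dir ./models

# Start an interactive chat session with the quantized merge.
llama-cli -m ./models/Qwen2.5-32B-Snegs.Q4_K_M.gguf -cnv
```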
## Merge Details

### Merge Method

This model was merged with the SCE merge method, using Qwen/QwQ-32B as the base.
### Models Merged

The following models were included in the merge:
- Yobenboben/Qwen2.5-32B-Juicy_Snowballs
- spow12/ChatWaifu_32B_reasoning
- trashpanda-org/QwQ-32B-Snowdrop-v0
### Configuration

The following YAML configuration was used to produce this model:

```yaml
merge_method: sce
models:
  - model: trashpanda-org/QwQ-32B-Snowdrop-v0
  - model: spow12/ChatWaifu_32B_reasoning
  - model: Yobenboben/Qwen2.5-32B-Juicy_Snowballs
base_model: Qwen/QwQ-32B
dtype: bfloat16
parameters:
  normalize: true
  int8_mask: true
  select_topk: 1.0
tokenizer_source: base
```
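To reproduce the merge, the configuration above can be fed to mergekit's CLI. A minimal sketch, assuming mergekit is installed and the YAML is saved as `snegs.yml` (the output directory name is illustrative; note the merge downloads four 32B checkpoints, so expect substantial disk and memory use):

```shell
# Install mergekit, then run the SCE merge described by the YAML above.
pip install mergekit
mergekit-yaml snegs.yml ./Qwen2.5-32B-Snegs --cuda
```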