---
base_model:
- Tesslate/Tessa-T1-14B
- deepseek-ai/DeepSeek-R1-Distill-Qwen-14B
- deepcogito/cogito-v1-preview-qwen-14B
- Qwen/Qwen2.5-Coder-14B-Instruct
- djuna/Q2.5-Veltha-14B-0.5
- prithivMLmods/Galactic-Qwen-14B-Exp2
- spacematt/Qwen2.5-Recursive-Coder-14B-Instruct
- JungZoona/T3Q-qwen2.5-14b-v1.0-e3
- Tesslate/UIGEN-T1.5-14B
- tanliboy/lambda-qwen2.5-14b-dpo-test
- prithivMLmods/Messier-Opus-14B-Elite7
- Cran-May/tempmotacilla-cinerea-0308
library_name: transformers
tags:
- mergekit
- merge
license: apache-2.0
---

# merge
This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
## Merge Details

### Merge Method

This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, using [spacematt/Qwen2.5-Recursive-Coder-14B-Instruct](https://huggingface.co/spacematt/Qwen2.5-Recursive-Coder-14B-Instruct) as the base.
### Models Merged

The following models were included in the merge:

* [Tesslate/Tessa-T1-14B](https://huggingface.co/Tesslate/Tessa-T1-14B)
* [deepseek-ai/DeepSeek-R1-Distill-Qwen-14B](https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Qwen-14B)
* [deepcogito/cogito-v1-preview-qwen-14B](https://huggingface.co/deepcogito/cogito-v1-preview-qwen-14B)
* [Qwen/Qwen2.5-Coder-14B-Instruct](https://huggingface.co/Qwen/Qwen2.5-Coder-14B-Instruct)
* [djuna/Q2.5-Veltha-14B-0.5](https://huggingface.co/djuna/Q2.5-Veltha-14B-0.5)
* [prithivMLmods/Galactic-Qwen-14B-Exp2](https://huggingface.co/prithivMLmods/Galactic-Qwen-14B-Exp2)
* [JungZoona/T3Q-qwen2.5-14b-v1.0-e3](https://huggingface.co/JungZoona/T3Q-qwen2.5-14b-v1.0-e3)
* [Tesslate/UIGEN-T1.5-14B](https://huggingface.co/Tesslate/UIGEN-T1.5-14B)
* [tanliboy/lambda-qwen2.5-14b-dpo-test](https://huggingface.co/tanliboy/lambda-qwen2.5-14b-dpo-test)
* [prithivMLmods/Messier-Opus-14B-Elite7](https://huggingface.co/prithivMLmods/Messier-Opus-14B-Elite7)
* [Cran-May/tempmotacilla-cinerea-0308](https://huggingface.co/Cran-May/tempmotacilla-cinerea-0308)
### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: Cran-May/tempmotacilla-cinerea-0308
  - model: Qwen/Qwen2.5-Coder-14B-Instruct
  - model: deepseek-ai/DeepSeek-R1-Distill-Qwen-14B
  - model: Tesslate/Tessa-T1-14B
  - model: prithivMLmods/Messier-Opus-14B-Elite7
  - model: Tesslate/UIGEN-T1.5-14B
  - model: deepcogito/cogito-v1-preview-qwen-14B
  - model: djuna/Q2.5-Veltha-14B-0.5
  - model: prithivMLmods/Galactic-Qwen-14B-Exp2
  - model: JungZoona/T3Q-qwen2.5-14b-v1.0-e3
  - model: tanliboy/lambda-qwen2.5-14b-dpo-test
  - model: spacematt/Qwen2.5-Recursive-Coder-14B-Instruct
merge_method: model_stock
base_model: spacematt/Qwen2.5-Recursive-Coder-14B-Instruct
normalize: true
int8_mask: true
dtype: bfloat16
```