
This is an EXL2 quant (4.35 bits per weight, 8-bit output head) of Sao10K/Llama-3.3-70B-Vulpecula-r1.

EXL2 Quants by ArtusDev.
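A minimal sketch of fetching this quant with the `huggingface-cli` tool; the local directory name is an arbitrary choice, not mandated by the card:

```shell
# Download the EXL2 quant repo to a local folder (path is illustrative).
huggingface-cli download ArtusDev/Llama-3.3-70B-Vulpecula-r1_EXL2_4.35bpw_H8 \
  --local-dir ./Llama-3.3-70B-Vulpecula-r1_EXL2_4.35bpw_H8
```

The resulting folder can then be pointed at by an EXL2-compatible loader such as exllamav2 or a frontend built on it.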

