YiSM-34B-0rn

ExLlamaV2 3.2 bpw (bits per weight) quants of https://huggingface.co/altomek/YiSM-34B-0rn
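A minimal sketch of fetching these quantized weights (an assumed workflow, not stated on this card): `hf_hub_url` from `huggingface_hub` formats the download URL for a file in the repo without any network access, and `snapshot_download` with the same `repo_id` would pull the full EXL2 shards. Actually loading the model with ExLlamaV2 afterwards requires a CUDA GPU.

```python
# Sketch: resolve where the EXL2 quant lives on the Hub.
# hf_hub_url only formats the resolve URL; no network access is needed.
from huggingface_hub import hf_hub_url

repo_id = "altomek/YiSM-34B-0rn-3.2bpw-EXL2"
url = hf_hub_url(repo_id=repo_id, filename="config.json")
print(url)
```

To download everything instead of one file, `snapshot_download(repo_id)` (also from `huggingface_hub`) fetches the whole repository into the local Hub cache.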

Model tree: altomek/YiSM-34B-0rn-3.2bpw-EXL2 is one of 9 quantized versions of altomek/YiSM-34B-0rn.