
This model is a merge of LLaMA-13B and the SuperCOT LoRA:

`huggyllama/llama-13b` + `kaiokendev/SuperCOT-LoRA/13b/gpu/cutoff-2048`
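Merging a LoRA means folding its low-rank update into the base weights so no adapter is needed at inference. A toy numpy sketch of that operation (shapes, rank, and scaling here are illustrative, not taken from the actual checkpoints):

```python
import numpy as np

# Conceptual sketch of a LoRA merge: the adapter's low-rank update
# B @ A, scaled by alpha / r, is added into the base weight matrix.
# Dimensions and values are illustrative only.
d, r = 8, 2                      # model dim, LoRA rank (toy sizes)
rng = np.random.default_rng(0)
W = rng.standard_normal((d, d))  # base weight (e.g. an attention projection)
A = rng.standard_normal((r, d))  # LoRA down-projection
B = rng.standard_normal((d, r))  # LoRA up-projection
alpha = 4                        # LoRA scaling hyperparameter

W_merged = W + (alpha / r) * (B @ A)  # merged weight used at inference

x = rng.standard_normal(d)
# The merged weight reproduces base output plus the adapter path exactly.
assert np.allclose(W_merged @ x, W @ x + (alpha / r) * (B @ (A @ x)))
```

In practice this is what PEFT-style merge tooling does for every adapted weight matrix in the model.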

Quantized with GPTQ (4-bit, group size 128, act-order, true-sequential):

```shell
CUDA_VISIBLE_DEVICES=0 python llama.py c4 --wbits 4 --true-sequential --act-order --groupsize 128
```
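The `--wbits 4 --groupsize 128` flags mean each group of 128 weights shares its own quantization scale, with values rounded to 16 levels. A toy numpy illustration of the group-wise idea (GPTQ itself additionally compensates rounding error across columns; plain round-to-nearest is used here only to show the grouping):

```python
import numpy as np

def quantize_group(w, bits=4):
    # Round-to-nearest quantization for one group, with a per-group
    # scale and zero point derived from the group's min/max range.
    levels = 2 ** bits - 1
    lo, hi = w.min(), w.max()
    scale = (hi - lo) / levels if hi > lo else 1.0
    q = np.round((w - lo) / scale)      # integer codes in [0, 15]
    return q * scale + lo                # dequantized values

rng = np.random.default_rng(0)
weights = rng.standard_normal(1024)
# Split into groups of 128, each quantized with its own scale
# (this is what --groupsize 128 controls).
deq = np.concatenate([quantize_group(g) for g in np.split(weights, 1024 // 128)])
assert deq.shape == weights.shape
assert np.abs(deq - weights).max() < 0.3  # small per-group rounding error
```

Smaller group sizes give each scale less range to cover, reducing error at the cost of storing more scales.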

In text-generation-webui (ooba), make sure to load the model with `--wbits 4 --groupsize 128`.
