Paper: Resolving Interference When Merging Models (arXiv:2306.01708)
This is a merge of pre-trained language models created using mergekit.
Use the ChatML or Mistral preset for prompting.
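As a rough illustration, here is a minimal sketch of building a ChatML-style prompt with transformers' `apply_chat_template`; the repository id is a placeholder, and it assumes the merged model's tokenizer ships a ChatML chat template:

```python
from transformers import AutoTokenizer

# Placeholder repo id; point this at the merged model's actual path.
tokenizer = AutoTokenizer.from_pretrained("your-username/merged-model")

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
]

# If the tokenizer defines a ChatML template, this renders
# <|im_start|>role ... <|im_end|> blocks plus a trailing assistant header.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)
```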
This model was merged using the TIES merge method, with redrix/patricide-12B-Unslop-Mell as the base.
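As a rough illustration of what TIES does (trim each task vector to its highest-magnitude entries, elect a per-parameter sign, then average only the agreeing entries), here is a simplified single-tensor sketch. It is not mergekit's implementation; the function name and arguments are invented for the example:

```python
import torch

def ties_merge_tensor(base, finetuned, density=0.5, weight=0.5):
    """Simplified TIES sketch for one parameter tensor (illustrative only).

    base:      base model tensor
    finetuned: list of fine-tuned tensors with the same shape
    density:   fraction of each task vector to keep, by magnitude
    weight:    scaling applied to each kept task vector
    """
    trimmed = []
    for ft in finetuned:
        tau = ft - base                       # task vector
        k = max(int(density * tau.numel()), 1)
        # "Trim": zero everything below the k-th largest magnitude.
        threshold = tau.abs().flatten().kthvalue(tau.numel() - k + 1).values
        tau = torch.where(tau.abs() >= threshold, tau, torch.zeros_like(tau))
        trimmed.append(weight * tau)

    stacked = torch.stack(trimmed)
    # "Elect" a sign per parameter, then merge only entries that agree with it.
    sign = torch.sign(stacked.sum(dim=0))
    agree = torch.sign(stacked) == sign
    merged = (stacked * agree).sum(dim=0) / agree.sum(dim=0).clamp(min=1)
    return base + merged
```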
The following models were included in the merge:
- redrix/patricide-12B-Unslop-Mell
- ReadyArt/Forgotten-Safeword-12B-v4.0
The following YAML configuration was used to produce this model:
models:
  - model: redrix/patricide-12B-Unslop-Mell
    # no parameters necessary for base model
  - model: redrix/patricide-12B-Unslop-Mell
    parameters:
      density: 0.5
      weight: 0.5
  - model: ReadyArt/Forgotten-Safeword-12B-v4.0
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: redrix/patricide-12B-Unslop-Mell
parameters:
  normalize: false
  int8_mask: true
dtype: float16
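To reproduce a merge like this, the configuration can be saved to a local file (e.g. config.yml) and run with the mergekit-yaml CLI (`mergekit-yaml config.yml ./merged-model`) or from Python. The sketch below assumes a current mergekit install; the file and output paths are placeholders:

```python
import yaml
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML configuration shown above, saved locally as config.yml.
with open("config.yml", "r", encoding="utf-8") as f:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))

# Run the merge; set cuda=True if a GPU is available.
run_merge(
    merge_config,
    out_path="./merged-model",
    options=MergeOptions(
        cuda=False,
        copy_tokenizer=True,
        lazy_unpickle=True,
        low_cpu_memory=True,
    ),
)
```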