(A)lgorithmic (P)attern (E)mulation - Fiction!

This is the same approach as version 1, but with a much larger dataset. I started from the version 1 dataset of full-length stories and novels (up to 100k tokens of context), then merged in three other Gutenberg-based datasets that split their texts into chapters.
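
Splitting a Gutenberg-style book into chapters can be sketched like this; the regex, function name, and sample text are illustrative assumptions, not the actual preprocessing those datasets used:

```python
import re

def split_into_chapters(text):
    # Split on lines that look like "CHAPTER I", "Chapter 12", etc.
    # (a hedged sketch -- real Gutenberg texts need more robust handling)
    parts = re.split(r"(?im)^\s*chapter\s+[ivxlc\d]+.*$", text)
    return [p.strip() for p in parts if p.strip()]

book = "CHAPTER I\nIt was a dark night.\nCHAPTER II\nMorning came."
print(split_into_chapters(book))  # → ['It was a dark night.', 'Morning came.']
```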

The merged dataset has about 7200 entries.

Uploaded finetuned model

  • Developed by: leftyfeep
  • License: apache-2.0
  • Finetuned from model: unsloth/Mistral-Nemo-Instruct-2407-bnb-4bit

This Mistral model was trained 2x faster with Unsloth and Hugging Face's TRL library.

Model size: 12B params (Safetensors)
Tensor type: BF16

Model tree for leftyfeep/ape-fiction-2-mistral-nemo

  • Finetuned from: unsloth/Mistral-Nemo-Instruct-2407-bnb-4bit
  • Quantizations: 3 models