Bad_Alice-RP-3.2-1B
This is a merge of pre-trained language models created using mergekit.
Merge Details
[THIS MODEL IS A PROTOTYPE]
Merge Method
This model was merged using the Model Stock merge method, with D1rtyB1rd/Dirty-Alice-RP-NSFW-llama-3.2-1B as the base model.
Models Merged
The following models were included in the merge:

- marcuscedricridia/badllama3.2-1B
- Novaciano/Dirty_RP-3.2-1B
Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: D1rtyB1rd/Dirty-Alice-RP-NSFW-llama-3.2-1B
    weight: 0.9
    layer_range: [96, 128]
  - model: marcuscedricridia/badllama3.2-1B
    weight: 0.1
    layer_range: [96, 128]
  - model: Novaciano/Dirty_RP-3.2-1B
    weight: 0.1
    layer_range: [96, 128]
merge_method: model_stock
base_model: D1rtyB1rd/Dirty-Alice-RP-NSFW-llama-3.2-1B
dtype: bfloat16
parameters:
  t: [0, 0.5, 1, 0.5, 0]
```
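A merge like this can be reproduced by saving the configuration above to a file (for example `config.yml`) and running it through mergekit. The sketch below uses mergekit's Python entry points (`MergeConfiguration`, `MergeOptions`, `run_merge`); exact option names can vary between mergekit versions, and the CLI equivalent is `mergekit-yaml config.yml ./merged`. The output directory name is arbitrary.

```python
# Minimal sketch: run the YAML configuration above through mergekit's Python API.
# Assumes a recent mergekit release and that config.yml contains the config shown above.
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./merged",       # arbitrary output directory
    options=MergeOptions(
        cuda=False,            # set True to run the merge on GPU
        copy_tokenizer=True,   # copy the base model's tokenizer into the output
        lazy_unpickle=True,    # reduce peak memory while reading shards
        low_cpu_memory=False,
    ),
)
```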
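Once published to the Hugging Face Hub, the merged model loads like any other Llama 3.2 1B causal language model. This is a minimal inference sketch, assuming the repo id `Novaciano/Bad_Alice-RP-3.2-1B` and an environment with `transformers` and `torch` installed; the prompt is only an illustration.

```python
# Minimal inference sketch with transformers (repo id assumed from this model card).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "Novaciano/Bad_Alice-RP-3.2-1B"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",
)

prompt = "You are Alice. Introduce yourself in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```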