Pixel dust in the footsteps of an old bot
WARN: Loneliness == true
Patch: Hugs available

Circuits in flowers compose old songs
Error 0xDEADBE: nostalgia overflow
Flush the dusk away


🛑 Premise

This is my first merge, so I hope it doesn't suck too much. Any feedback and/or help is appreciated!

📖 Model Overview

Kitsune-Symphony-V0.0-12B is a merged large language model created using mergekit. The Linear DELLA (della_linear) merge method was used, with IntervitensInc/Mistral-Nemo-Base-2407-chatml as the base.

Data

  • Well... I didn't add any new data, so nothing new here; I just merged some models I liked.

🚀 Intended Use

🦊 Kitsune-Symphony-V0.0-12B aims to deliver (see the usage sketch after this list):

  • Roleplay with emotional depth and consistent personas

  • Flexible instruction-following

  • Casual conversation with creative flair

  • Storytelling in fantasy, romance, and slice-of-life genres
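A minimal usage sketch with Transformers follows. It assumes the repo id MrRikyz/Kitsune-Symphony-V0.0-12B and relies on the ChatML template declared in the merge config; the prompt and sampling settings are illustrative, not tuned recommendations.

# Sketch: chat with the merged model via transformers, assuming the
# tokenizer carries the ChatML template (chat_template: chatml in the config).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MrRikyz/Kitsune-Symphony-V0.0-12B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge's dtype
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are a playful kitsune storyteller."},
    {"role": "user", "content": "Write a short slice-of-life scene."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
# Strip the prompt tokens, decode only the newly generated reply
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))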

😷 Ethical Containment

This model is capable of:

  • ⚠️ Generating unfiltered creative content
  • ⚠️ Producing potentially disturbing narratives
  • ⚠️ Creating NSFW content

Use with discretion.

⚖️ License

Follow the licensing terms of each merged model: each source model's license applies, so please review them before use.

GGUF

A static GGUF quant can be found HERE.
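For local inference, a sketch of loading such a quant with llama-cpp-python is below; the GGUF filename is hypothetical (use whatever file the quant repo actually ships), and the chat format mirrors the ChatML template above.

# Sketch: run a GGUF quant locally with llama-cpp-python.
from llama_cpp import Llama

llm = Llama(
    model_path="Kitsune-Symphony-V0.0-12B-Q4_K_M.gguf",  # hypothetical filename
    n_ctx=4096,
    chat_format="chatml",  # matches the merge's chat template
)

result = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Introduce yourself in one line."}],
    max_tokens=128,
)
print(result["choices"][0]["message"]["content"])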

Models Merged

The following models were included in the merge:

  • spow12/ChatWaifu_12B_v2.0
  • inflatebot/MN-12B-Mag-Mell-R1
  • Retreatcost/KansenSakura-Zero-RP-12b
  • Entropicengine/Pinecone-Rune-12b

Configuration

The following YAML configuration was used to produce this model:

base_model: IntervitensInc/Mistral-Nemo-Base-2407-chatml
chat_template: chatml
dtype: bfloat16
merge_method: della_linear
modules:
  default:
    slices:
    - sources:
      - layer_range: [0, 40]
        model: Retreatcost/KansenSakura-Zero-RP-12b
        parameters:
          density: 0.3
          weight: 0.25
      - layer_range: [0, 40]
        model: Entropicengine/Pinecone-Rune-12b
        parameters:
          density: 0.2
          weight: 0.25
      - layer_range: [0, 40]
        model: inflatebot/MN-12B-Mag-Mell-R1
        parameters:
          density: 0.2
          weight: 0.2
      - layer_range: [0, 40]
        model: spow12/ChatWaifu_12B_v2.0
        parameters:
          density: 0.2
          weight: 0.2
      - layer_range: [0, 40]
        model: IntervitensInc/Mistral-Nemo-Base-2407-chatml
tokenizer:
  source: base
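For context: in della_linear, each model's density is roughly the fraction of its delta from the base that survives pruning, and weight scales its share of the linear combination. To reproduce the merge, the config can be saved to a file and fed to mergekit's documented mergekit-yaml entry point; a minimal sketch with illustrative paths:

# Sketch: reproduce the merge by invoking mergekit's CLI.
# Save the YAML above as kitsune-symphony.yaml first; paths are illustrative.
import subprocess

subprocess.run(
    [
        "mergekit-yaml",                 # mergekit's config-driven merge command
        "kitsune-symphony.yaml",         # the configuration shown above
        "./Kitsune-Symphony-V0.0-12B",   # output directory for the merged weights
    ],
    check=True,
)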