---
base_model:
- ReadyArt/Forgotten-Safeword-24B-3.6
- Trappu/Picaro-Apparatus-ties0.7-24b
- Sorawiz/MistralSmall-Creative-24B-Realist
library_name: transformers
tags:
- mergekit
- merge
---

# Chat Template

Mistral Instruct

```
[INST] {{ if .System }}{{ .System }} {{ end }}{{ .Prompt }} [/INST]
```

ChatML

```
{{ if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}{{ if .Prompt }}<|im_start|>user
{{ .Prompt }}<|im_end|>
{{ end }}<|im_start|>assistant
{{ .Response }}{{ if .Response }}<|im_end|>{{ end }}
```

# GGUF

* Q6_K quant: [Sorawiz/MistralSmall-Creative-24B-Q6_K-GGUF](https://huggingface.co/Sorawiz/MistralSmall-Creative-24B-Q6_K-GGUF)

# Merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [DARE TIES](https://arxiv.org/abs/2311.03099) merge method, with [ReadyArt/Forgotten-Safeword-24B-3.6](https://huggingface.co/ReadyArt/Forgotten-Safeword-24B-3.6) as the base model.

### Models Merged

The following models were included in the merge:

* [Trappu/Picaro-Apparatus-ties0.7-24b](https://huggingface.co/Trappu/Picaro-Apparatus-ties0.7-24b)
* [Sorawiz/MistralSmall-Creative-24B-Realist](https://huggingface.co/Sorawiz/MistralSmall-Creative-24B-Realist)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
merge_method: dare_ties
base_model: ReadyArt/Forgotten-Safeword-24B-3.6
models:
  - model: ReadyArt/Forgotten-Safeword-24B-3.6
    parameters:
      weight: 0.2
  - model: Sorawiz/MistralSmall-Creative-24B-Realist
    parameters:
      weight: 0.4
  - model: Trappu/Picaro-Apparatus-ties0.7-24b
    parameters:
      weight: 0.4
parameters:
  density: 0.8
tokenizer:
  source: union
chat_template: auto
```
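As a rough intuition for what `dare_ties` does with the `weight` and `density` values in the config above: DARE randomly drops a fraction (1 − density) of each model's parameter deltas relative to the base and rescales the survivors, then TIES elects a per-parameter sign from the weighted deltas and averages only the deltas that agree with it. A toy NumPy sketch on flat parameter vectors (illustrative only, not mergekit's actual implementation; `dare_ties` here is a made-up helper name):

```python
import numpy as np

def dare_ties(base, tuned, weights, density, seed=0):
    """Toy DARE TIES merge on flat parameter vectors."""
    rng = np.random.default_rng(seed)
    deltas = []
    for t in tuned:
        delta = t - base
        # DARE: randomly drop (1 - density) of the delta entries,
        # rescale the survivors so the expected delta is unchanged
        mask = rng.random(delta.shape) < density
        deltas.append(mask * delta / density)
    # TIES: elect a per-parameter sign from the weighted deltas
    weighted = [w * d for w, d in zip(weights, deltas)]
    elected = np.sign(sum(weighted))
    # keep only deltas whose sign agrees with the elected one, then
    # take a weighted average over the agreeing models
    num = np.zeros_like(base)
    den = np.zeros_like(base)
    for w, d in zip(weights, deltas):
        agree = np.sign(d) == elected
        num += np.where(agree, w * d, 0.0)
        den += np.where(agree, w, 0.0)
    merged_delta = np.where(den > 0, num / np.maximum(den, 1e-12), 0.0)
    return base + merged_delta

base = np.zeros(4)
a = np.array([1.0, -1.0, 0.5, 0.0])
b = np.array([1.0, 1.0, -0.5, 0.2])
print(dare_ties(base, [a, b], weights=[0.5, 0.5], density=0.8))
```

Note how parameters where the two models pull in opposite directions (the second and third entries) cancel in the sign election instead of averaging toward zero noise; that interference resolution is the point of TIES.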
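To see how the ChatML template above expands at inference time, here is a minimal plain-Python sketch that assembles the prompt string the same way (the `chatml_prompt` helper is illustrative, not part of any library; in practice the tokenizer's built-in chat template handles this):

```python
def chatml_prompt(system, prompt):
    """Assemble a ChatML prompt string, mirroring the template above."""
    parts = []
    if system:
        parts.append(f"<|im_start|>system\n{system}<|im_end|>\n")
    if prompt:
        parts.append(f"<|im_start|>user\n{prompt}<|im_end|>\n")
    # leave the assistant turn open for the model to complete
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

print(chatml_prompt("You are a storyteller.", "Write one line."))
```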