---
base_model:
- meta-llama/Llama-3.3-70B-Instruct
- NeverSleep/Lumimaid-v0.2-70B
- Sao10K/L3.1-70B-Hanami-x1
- SicariusSicariiStuff/Negative_LLAMA_70B
- Undi95/Sushi-v1.4
- ArliAI/Llama-3.3-70B-ArliAI-RPMax-v1.4
library_name: transformers
tags:
- mergekit
- merge
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the [Model Breadcrumbs with TIES](https://arxiv.org/abs/2312.06795) merge method, with [meta-llama/Llama-3.3-70B-Instruct](https://huggingface.co/meta-llama/Llama-3.3-70B-Instruct) as the base.

### Models Merged

The following models were included in the merge:
* [NeverSleep/Lumimaid-v0.2-70B](https://huggingface.co/NeverSleep/Lumimaid-v0.2-70B)
* [Sao10K/L3.1-70B-Hanami-x1](https://huggingface.co/Sao10K/L3.1-70B-Hanami-x1)
* [SicariusSicariiStuff/Negative_LLAMA_70B](https://huggingface.co/SicariusSicariiStuff/Negative_LLAMA_70B)
* [Undi95/Sushi-v1.4](https://huggingface.co/Undi95/Sushi-v1.4)
* [ArliAI/Llama-3.3-70B-ArliAI-RPMax-v1.4](https://huggingface.co/ArliAI/Llama-3.3-70B-ArliAI-RPMax-v1.4)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: Undi95/Sushi-v1.4
    parameters:
      weight: 0.20
      density: 0.9
  - model: NeverSleep/Lumimaid-v0.2-70B
    parameters:
      weight: 0.20
      density: 0.9
  - model: ArliAI/Llama-3.3-70B-ArliAI-RPMax-v1.4
    parameters:
      weight: 0.20
      density: 0.9
  - model: Sao10K/L3.1-70B-Hanami-x1
    parameters:
      weight: 0.20
      density: 0.9
  - model: SicariusSicariiStuff/Negative_LLAMA_70B
    parameters:
      weight: 0.20
      density: 0.9
merge_method: breadcrumbs_ties
base_model: meta-llama/Llama-3.3-70B-Instruct
parameters:
  gamma: 0.01
  lambda: 1.1
  normalize: false
  int8_mask: true
out_dtype: bfloat16
tokenizer:
  source: SicariusSicariiStuff/Negative_LLAMA_70B
```
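
For intuition on what `density` and `gamma` control in the config above, the sketch below approximates breadcrumbs-style sparsification of a single task vector (a donor model's delta from the base): the largest-magnitude `gamma` fraction of entries is discarded as outliers, the next `density` fraction is kept, and the small-magnitude remainder is zeroed. This is a minimal illustration only, not mergekit's actual implementation; the `breadcrumbs_mask` helper and tensor names are hypothetical, the TIES sign-election step is omitted, and exact ranking details may differ from mergekit's code.

```python
import torch

def breadcrumbs_mask(delta: torch.Tensor, density: float = 0.9, gamma: float = 0.01) -> torch.Tensor:
    """Approximate Model Breadcrumbs sparsification of a task vector (delta = fine-tuned - base).

    Drops the `gamma` fraction of largest-magnitude entries (outliers), keeps the next
    `density` fraction, and zeroes out the small-magnitude remainder.
    """
    flat = delta.abs().flatten()
    n = flat.numel()
    k_outliers = int(n * gamma)   # largest-magnitude entries to discard
    k_keep = int(n * density)     # entries to retain after dropping outliers
    order = torch.argsort(flat, descending=True)
    keep_idx = order[k_outliers:k_outliers + k_keep]
    mask = torch.zeros(n, dtype=torch.bool, device=delta.device)
    mask[keep_idx] = True
    return mask.view_as(delta)

# Toy single-tensor usage with this card's settings (weight=0.20, lambda=1.1,
# density=0.9, gamma=0.01). The real merge repeats this per parameter tensor,
# resolves sign conflicts across donors (the TIES step), and sums all five deltas.
base_w = torch.randn(256, 256)
tuned_w = base_w + 0.01 * torch.randn(256, 256)
delta = tuned_w - base_w
pruned = delta * breadcrumbs_mask(delta, density=0.9, gamma=0.01)
merged_w = base_w + 1.1 * (0.20 * pruned)
```

With `normalize: false`, the per-model weights are used as given rather than being rescaled to sum to 1, which is presumably why each donor is set to 0.20 here.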