---
base_model:
- Delta-Vector/Shimamura-70B
- Delta-Vector/Austral-70B-Winton
library_name: transformers
tags:
- mergekit
- merge
- roleplay
- creative_writing
- llama
---
A simple merge, yet sovl in its own way. This merge sits between Shimamura and Austral Winton: I wanted to give Austral somewhat shorter prose, so FYI for all the 10,000+ token reply lovers.
Thanks Auri for testing!
Using the oh-so-great 0.2 SLERP merge weight, with Winton as the base.
Support me on Ko-Fi: https://ko-fi.com/deltavector
The model has been tuned with the Llama-3 Instruct formatting.
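For reference, the standard Llama-3 Instruct prompt template looks like this (the special tokens below are Meta's documented format; substitute your own system prompt and message text):

```
<|begin_of_text|><|start_header_id|>system<|end_header_id|>

{system prompt}<|eot_id|><|start_header_id|>user<|end_header_id|>

{user message}<|eot_id|><|start_header_id|>assistant<|end_header_id|>

```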
Merge config: https://files.catbox.moe/yw81rn.yml
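The linked YAML is the authoritative config. As a sketch only, a mergekit SLERP config matching the description above (0.2 weight, Winton as base) would typically look like this; the `dtype` value is an assumption, not taken from the linked file:

```yaml
# Hypothetical sketch of the merge; see the linked yml for the real config.
models:
  - model: Delta-Vector/Austral-70B-Winton
  - model: Delta-Vector/Shimamura-70B
merge_method: slerp
base_model: Delta-Vector/Austral-70B-Winton
parameters:
  t: 0.2   # interpolation weight toward the non-base model
dtype: bfloat16  # assumed; not confirmed by the source
```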
Thank you to Lucy Knada, Auri, Ateron, Alicat, Intervitens, Cgato, Kubernetes Bad and the rest of Anthracite.