Malaysian Finetuned Instruct LoRA Collection: continued finetuning of Instruct models using LoRA, from 0.5B up to 72B parameters. 18 items. A minimal setup sketch follows below.
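A minimal sketch of the continued-finetuning setup this collection describes, using Hugging Face PEFT. The base checkpoint id and the LoRA hyperparameters below are assumptions for illustration, not the collection's actual configuration; see the individual model cards for the real values.

# Sketch: attach LoRA adapters to an Instruct model for continued finetuning.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

# Assumption: one of the 0.5B-72B Instruct bases; substitute the actual base model.
base_id = "Qwen/Qwen2-0.5B-Instruct"

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)

# Standard LoRA recipe: low-rank adapters on the attention projections only.
# r and lora_alpha are illustrative defaults, not the collection's settings.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the adapter weights train; the base stays frozen

Because only the adapter weights are trainable, the same recipe scales from the 0.5B to the 72B base with the memory cost dominated by the frozen base model, which is what makes continued finetuning across this size range practical.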
MaLLaM 🌙 Collection: pretrained from scratch with a 4096 context length on 90B tokens of Malaysian text, https://huggingface.co/papers/2401.14680. 10 items. A loading sketch follows below.
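A minimal sketch of loading one of these pretrained checkpoints for generation with transformers. The repo id below is an assumption for illustration; check the collection for the actual model ids.

# Sketch: load a MaLLaM pretrained checkpoint and generate a continuation.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mesolitica/mallam-5b-4096"  # assumption: replace with a real repo from the collection

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Base (non-Instruct) pretrained model, so prompt with plain Malaysian text.
inputs = tokenizer("Kuala Lumpur ialah", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))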
Multi-Lingual Malaysian Embedding: Leveraging Large Language Models for Semantic Representations Paper • 2402.03053 • Published Feb 5, 2024
Large Malaysian Language Model Based on Mistral for Enhanced Local Language Understanding Paper • 2401.13565 • Published Jan 24, 2024