Update README.md
README.md CHANGED
@@ -9,9 +9,9 @@ tags:
- merge
---

-**
+**Typhoon2-DeepSeek-R1-70B**: Thai reasoning large language model.

-**
+**Typhoon2-DeepSeek-R1-70B** is a Thai 🇹🇭 reasoning large language model with 70 billion parameters, based on DeepSeek R1 70B Distill.

For more details, please see our [blog](https://arxiv.org/abs/2412.13702). The paper is coming soon.

@@ -22,14 +22,14 @@ For more details, please see our [blog](https://arxiv.org/abs/2412.13702). The paper is coming soon.
**Reasoning Performance**

<div align="center">
-<img src="https://storage.googleapis.com/typhoon-public/assets/typhoon2-text/r1_reasoning.jpg" alt="
+<img src="https://storage.googleapis.com/typhoon-public/assets/typhoon2-text/r1_reasoning.jpg" alt="Typhoon2 DeepSeek R1 70B Reasoning Performance" width="100%" style="margin-left: auto; margin-right: auto; display: block;"/>
</div>


**General Instruction-Following Performance**

<div align="center">
-<img src="https://storage.googleapis.com/typhoon-public/assets/typhoon2-text/r1_general.jpg" alt="
+<img src="https://storage.googleapis.com/typhoon-public/assets/typhoon2-text/r1_general.jpg" alt="Typhoon2 DeepSeek R1 70B General Performance" width="100%" style="margin-left: auto; margin-right: auto; display: block;"/>
</div>
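Since the updated card only names the model and its base (DeepSeek R1 70B Distill, a standard causal LM), a minimal usage sketch may help readers. The snippet below is an assumption, not part of this commit: the repository id is a placeholder, and the prompt and generation settings are illustrative only.

```python
# Minimal sketch (not from the model card): loading the model with Hugging Face transformers.
# "PLACEHOLDER/typhoon2-deepseek-r1-70b" is a hypothetical repo id - substitute the real one.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "PLACEHOLDER/typhoon2-deepseek-r1-70b"  # hypothetical repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 70B parameters: use bf16 to halve memory vs fp32
    device_map="auto",           # shard the checkpoint across available GPUs
)

# "Solve the equation 2x + 3 = 11" in Thai, to exercise the Thai reasoning ability.
messages = [{"role": "user", "content": "แก้สมการ 2x + 3 = 11"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=512)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

A 70B checkpoint will not fit on a single consumer GPU; `device_map="auto"` simply spreads the layers over whatever accelerators are visible, and quantized loading would be needed on smaller setups.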