Update README.md
MixTAO-7Bx2-MoE is a Mixture of Experts (MoE) model.

This model is mainly used for experiments in large language model technology; successive, increasingly refined iterations are intended to eventually produce a high-quality large language model.

### Prompt Template

```
### Instruction:
<prompt> (without the <>)
### Response:
```
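The template above can be applied programmatically before sending text to the model. This is a minimal sketch; `build_prompt` is a hypothetical helper name, not part of the model card or any library.

```python
def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in the Alpaca-style template shown above.

    Hypothetical helper for illustration: the `<prompt>` placeholder is
    replaced by the instruction text, without the angle brackets.
    """
    return (
        "### Instruction:\n"
        f"{instruction}\n"
        "### Response:\n"
    )

# The model is expected to continue generating after "### Response:".
print(build_prompt("Explain what a Mixture of Experts model is."))
```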
### 🦒 Colab

| Link | Info - Model Name |
| --- | --- |