Upload folder using huggingface_hub
README.md CHANGED
@@ -12,7 +12,7 @@ tags:
 # Microsoft Phi-Ground-4B-7C

 <p align="center">
-<a href="https://
+<a href="https://microsoft.github.io/Phi-Ground/" target="_blank">HomePage</a> | <a href="https://huggingface.co/papers/2507.23779" target="_blank">Paper</a> | <a href="https://arxiv.org/abs/2507.23779" target="_blank">Arxiv</a> | <a href="https://huggingface.co/microsoft/Phi-Ground" target="_blank">Model</a> | <a href="https://github.com/microsoft/Phi-Ground/tree/main/benchmark/new_annotations" target="_blank">Eval data</a>
 </p>

 

@@ -29,7 +29,7 @@ tags:
 

 ### Usage
-
+The current `transformers` version can be verified with: `pip list | grep transformers`.

 Examples of required packages:
 ```

@@ -88,4 +88,4 @@ image = process_image(Image.open(image_path))
 ```


-Then you can use huggingface model or [vllm](https://github.com/vllm-project/vllm) to inference. End-to-end examples and benchmark results reproduction
+Then you can use the Hugging Face model or [vLLM](https://github.com/vllm-project/vllm) for inference. We also provide [end-to-end examples](https://github.com/microsoft/Phi-Ground/tree/main/examples/call_example.py) and [benchmark results reproduction](https://github.com/microsoft/Phi-Ground/tree/main/benchmark/test_sspro.sh).
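
As a quick-start complement to those links, here is a minimal, illustrative `transformers` sketch. It is not the card's official usage: the repo id (`microsoft/Phi-Ground`), the `AutoProcessor`/`AutoModelForCausalLM` classes, the plain text-plus-image prompt, and the input file names are all assumptions; the authoritative prompt format and the repo's own `process_image` preprocessing live in the end-to-end example linked above.

```python
# Illustrative sketch only -- not the official Phi-Ground usage. The repo id,
# processor/model classes, and prompt format below are assumptions; see the
# linked end-to-end example (examples/call_example.py) for the real pipeline.
import torch
from PIL import Image
from transformers import AutoModelForCausalLM, AutoProcessor

model_id = "microsoft/Phi-Ground"  # assumption: substitute the exact repo id of this card

# Load the processor and model, trusting any custom code shipped with the checkpoint.
processor = AutoProcessor.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,
)

image = Image.open("screenshot.png")        # hypothetical GUI screenshot
instruction = "Click the 'Submit' button."  # hypothetical grounding instruction

# Build model inputs; the real prompt template may differ from this plain form.
inputs = processor(text=instruction, images=image, return_tensors="pt").to(model.device)

# Generate the grounding prediction as text and decode it.
output_ids = model.generate(**inputs, max_new_tokens=128)
print(processor.batch_decode(output_ids, skip_special_tokens=True)[0])
```

For serving-style or batched inference the card points to vLLM instead, and the linked benchmark script covers reproducing the reported results.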