---
language:
- en
---
## `faizack/kronos-small-custom`
Custom Hugging Face model repo for deploying the **Kronos-small** time-series forecasting model as an Inference Endpoint.
This folder is structured so you can **zip it or `git init` it and push directly** to Hugging Face under your account [`faizack`](https://huggingface.co/faizack).
### 1. Files expected in this repo
You will need the following files in the root of the Hugging Face repo:
- `config.json` – copied or downloaded from `NeoQuasar/Kronos-small`
- `model.safetensors` – weights from `NeoQuasar/Kronos-small`
- `model.py` – Kronos model definition (from the official GitHub repo)
- `tokenizer.py` – Kronos tokenizer implementation
- `predictor.py` – `KronosPredictor` wrapper
- `inference.py` – entrypoint used by Hugging Face Inference Endpoints (already provided here)
- `requirements.txt` – Python dependencies (already provided here)
This folder currently includes:
- `README.md` (this file)
- `inference.py`
- `requirements.txt`
- `.env.example`
- `.gitattributes` (Git LFS for safetensors)
- `.gitignore`
You still need to add:
- `config.json`
- `model.safetensors`
- `model.py`
- `tokenizer.py`
- `predictor.py`
### 2. How to prepare and push to Hugging Face
From this folder:
```bash
cd kronos-small-custom
# (optional) initialize git
git init
git lfs install
# Log in to Hugging Face
huggingface-cli login
# Create the remote repo under your account
huggingface-cli repo create faizack/kronos-small-custom --type model
# Add HF remote
git remote add origin https://huggingface.co/faizack/kronos-small-custom
```
Now copy in the Kronos implementation and weights:
1. From the official Kronos GitHub repo, copy:
- `model.py`
- `tokenizer.py`
- `predictor.py`
2. From `NeoQuasar/Kronos-small` on Hugging Face, download:
- `config.json`
- `model.safetensors`
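The copy/download steps above can be sketched on the command line. The GitHub URL below is an assumption (verify it against the official Kronos repository), as is the assumption that the three Python files sit at the repo root:

```shell
# Clone the Kronos source (repo URL is an assumption -- adjust if the official repo differs)
git clone https://github.com/shiyu-coder/Kronos /tmp/kronos-src

# Copy the implementation files (assuming they live at the repo root)
cp /tmp/kronos-src/model.py /tmp/kronos-src/tokenizer.py /tmp/kronos-src/predictor.py .

# Download the config and weights from the Hugging Face Hub
huggingface-cli download NeoQuasar/Kronos-small config.json model.safetensors --local-dir .
```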
Then commit and push:
```bash
git add .
git commit -m "Initial Kronos-small custom deployment"
git push -u origin main
```
### 3. Inference contract
`inference.py` exposes a `predict(request)` function that Hugging Face Inference Endpoints will call.
Expected JSON body:
```json
{
  "inputs": {
    "df": [
      {"open": 1.0, "high": 1.1, "low": 0.9, "close": 1.05},
      {"open": 1.05, "high": 1.12, "low": 1.0, "close": 1.08}
    ],
    "x_timestamp": ["2024-01-01T00:00:00Z", "2024-01-01T01:00:00Z"],
    "y_timestamp": ["2024-01-01T02:00:00Z", "2024-01-01T03:00:00Z"],
    "pred_len": 2,
    "T": 1.0,
    "top_p": 0.9,
    "sample_count": 1
  }
}
```
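As a sketch, a client could assemble this body before POSTing it to the endpoint. `build_payload` is a hypothetical helper (not part of this repo), and the endpoint URL and token in the commented-out request are placeholders:

```python
def build_payload(df_rows, x_timestamp, y_timestamp,
                  pred_len, T=1.0, top_p=0.9, sample_count=1):
    """Assemble the JSON body expected by inference.py's predict()."""
    return {
        "inputs": {
            "df": df_rows,                # list of OHLC dicts (history window)
            "x_timestamp": x_timestamp,   # timestamps of the history rows
            "y_timestamp": y_timestamp,   # timestamps to forecast
            "pred_len": pred_len,         # number of steps to predict
            "T": T,                       # sampling temperature
            "top_p": top_p,               # nucleus sampling threshold
            "sample_count": sample_count, # samples to draw per step
        }
    }

payload = build_payload(
    df_rows=[
        {"open": 1.0, "high": 1.1, "low": 0.9, "close": 1.05},
        {"open": 1.05, "high": 1.12, "low": 1.0, "close": 1.08},
    ],
    x_timestamp=["2024-01-01T00:00:00Z", "2024-01-01T01:00:00Z"],
    y_timestamp=["2024-01-01T02:00:00Z", "2024-01-01T03:00:00Z"],
    pred_len=2,
)

# To call the deployed endpoint (URL and token are placeholders):
# import requests
# resp = requests.post(
#     "https://<your-endpoint>.endpoints.huggingface.cloud",
#     headers={"Authorization": "Bearer <HF_TOKEN>"},
#     json=payload,
# )
```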
Response structure:
```json
{
  "predictions": [
    {"open": ..., "high": ..., "low": ..., "close": ...},
    {"open": ..., "high": ..., "low": ..., "close": ...}
  ]
}
```
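On the client side, one way to consume this response is to flatten the prediction rows into per-field series. A minimal sketch; the numeric values below are made up purely for illustration:

```python
def predictions_to_series(response):
    """Turn {"predictions": [{open, high, low, close}, ...]} into a
    dict of per-field lists, e.g. {"close": [...], ...}."""
    rows = response["predictions"]
    fields = ("open", "high", "low", "close")
    return {f: [row[f] for row in rows] for f in fields}

# Example with illustrative values:
sample = {
    "predictions": [
        {"open": 1.08, "high": 1.10, "low": 1.06, "close": 1.09},
        {"open": 1.09, "high": 1.12, "low": 1.07, "close": 1.11},
    ]
}
series = predictions_to_series(sample)
print(series["close"])  # forecast close prices, in order
```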
You can adapt this contract as needed, as long as `predict` returns JSON-serializable data. |