# Moltbook Embeddings
Pre-computed text embeddings for the moltbook-files dataset: a synthetic AI-agent social network with 232k posts across 3,628 communities.
## Details

| | |
| --- | --- |
| Model | `Qwen/Qwen3-Embedding-8B` |
| Vectors | 219,252 |
| Precision | float32 |
| Normalized | Yes (L2) |
| Format | NumPy `.npy` |
| Size | ~3.6 GB |
## Files

- `embeddings.npy`: shape `(219252, D)`, one row per text
- `embeddings_meta.json`: metadata (count and model name)
## Usage

```python
import numpy as np

# mmap_mode="r" avoids pulling the full ~3.6 GB file into RAM at once
embeddings = np.load("embeddings.npy", mmap_mode="r")
print(embeddings.shape)  # (219252, D)
```
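Because the rows are L2-normalized, cosine similarity between two vectors is just their dot product, so nearest-neighbor search needs no extra normalization step. A minimal sketch, using small random stand-in vectors in place of the real matrix (the shapes, seed, and `top_k` helper are illustrative, not part of the dataset):

```python
import numpy as np

# Stand-in for embeddings.npy: random rows, L2-normalized like the dataset
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(1000, 64)).astype(np.float32)
embeddings /= np.linalg.norm(embeddings, axis=1, keepdims=True)

def top_k(query_idx: int, k: int = 5) -> np.ndarray:
    """Indices of the k rows most similar to row query_idx.

    With unit-norm rows, cosine similarity == dot product.
    """
    scores = embeddings @ embeddings[query_idx]
    # argpartition selects the k+1 best in O(n); sort only those winners
    top = np.argpartition(-scores, k)[: k + 1]
    top = top[np.argsort(-scores[top])]
    # Drop the query itself (its self-similarity of 1.0 always ranks first)
    return top[top != query_idx][:k]

print(top_k(0))
```

On the real matrix, replace the random stand-in with the array loaded from `embeddings.npy`; the dot-product trick is exact only because the vectors are already unit length.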
## How they were generated
Texts were encoded with sentence-transformers using Qwen/Qwen3-Embedding-8B in bfloat16, batch size 16, with L2 normalization, then stored as float32.
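The numeric post-processing described above (L2-normalize each vector, then store as float32) can be sketched without the model itself. The encoding step needs sentence-transformers and the 8B checkpoint, so this hedged sketch covers only the normalization and dtype conversion; the `postprocess` name, batch shape, and use of float64 as a stand-in for the model's bfloat16 output are assumptions for illustration:

```python
import numpy as np

def postprocess(raw: np.ndarray) -> np.ndarray:
    """L2-normalize each row, then cast to float32 for storage."""
    norms = np.linalg.norm(raw, axis=1, keepdims=True)
    return (raw / norms).astype(np.float32)

# Stand-in for one encoder batch (batch size 16, hypothetical dimension)
batch = np.random.default_rng(1).normal(size=(16, 4096))
vectors = postprocess(batch)
print(vectors.dtype, np.linalg.norm(vectors, axis=1).round(5))
```

After this step every row has unit norm, which is what makes the dot-product similarity shortcut in the usage example valid.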