---
license: mit
language:
- zh
- en
- de
- fr
task_categories:
- feature-extraction
- text-classification
tags:
- embeddings
- sociology
- retrieval
- sentence-transformers
- numpy
- qwen3
pretty_name: THETA Embeddings
---

# THETA-embeddings

Pre-computed embeddings generated by [THETA](https://huggingface.co/CodeSoulco/THETA), a domain-specific embedding model fine-tuned from Qwen3-Embedding for sociology and social science texts.

## Description

This dataset contains dense vector embeddings produced under three settings:

- **zero_shot:** Embeddings from the base Qwen3-Embedding model without fine-tuning
- **supervised:** Embeddings from the LoRA-adapted model trained with label-guided contrastive learning
- **unsupervised:** Embeddings from the LoRA-adapted model trained with SimCSE

## Repository Structure

```
CodeSoulco/THETA-embeddings/
├── 0.6B/
│   ├── zero_shot/
│   ├── supervised/
│   └── unsupervised/
└── 4B/
    ├── zero_shot/
    ├── supervised/
    └── unsupervised/
```

## Embedding Details

| Model | Dimension | Format |
|---|---|---|
| Qwen3-Embedding-0.6B | 896 | `.npy` |
| Qwen3-Embedding-4B | 2560 | `.npy` |

**Source Datasets:** germanCoal, FCPB, socialTwitter, hatespeech, mental_health

## How to Use

```python
import numpy as np

# Load pre-computed embeddings
embeddings = np.load("0.6B/zero_shot/germanCoal_zero_shot_embeddings.npy")
print(embeddings.shape)  # (num_samples, 896)
```

Or download via `huggingface_hub`:

```python
from huggingface_hub import hf_hub_download
import numpy as np

path = hf_hub_download(
    repo_id="CodeSoulco/THETA-embeddings",
    filename="0.6B/supervised/socialTwitter_supervised_embeddings.npy",
    repo_type="dataset"
)
embeddings = np.load(path)
```

## Related

- **Model (LoRA weights):** [CodeSoulco/THETA](https://huggingface.co/CodeSoulco/THETA)

## License

This dataset is released under the **MIT License**.

## Citation

```bibtex
@misc{theta2026,
  title={THETA: Textual Hybrid Embedding--based Topic Analysis},
  author={CodeSoul},
  year={2026},
  publisher={Hugging Face},
  url={https://huggingface.co/CodeSoulco/THETA}
}
```
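
## Bulk Download

If you want every file for one model size and setting rather than a single array, a minimal sketch using `huggingface_hub.snapshot_download` with an `allow_patterns` glob is shown below. The glob mirrors the repository structure above, and the loading loop assumes the `<dataset>_<setting>_embeddings.npy` file-name convention from the "How to Use" examples.

```python
from pathlib import Path

import numpy as np
from huggingface_hub import snapshot_download

# Download only the 0.6B supervised embeddings
# (the pattern mirrors the repository layout shown above).
local_dir = snapshot_download(
    repo_id="CodeSoulco/THETA-embeddings",
    repo_type="dataset",
    allow_patterns=["0.6B/supervised/*.npy"],
)

# Load every downloaded array; file names are assumed to follow the
# <dataset>_<setting>_embeddings.npy convention used elsewhere on this card.
for npy_path in sorted(Path(local_dir).glob("0.6B/supervised/*.npy")):
    embeddings = np.load(npy_path)
    print(npy_path.name, embeddings.shape)
```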