dinho1597/phi-2-telecom-ft
Tags: Sentence Similarity, sentence-transformers, Safetensors, phi, feature-extraction, Generated from Trainer, dataset_size:6552, loss:MultipleNegativesRankingLoss, Eval Results
Dataset: dinho1597/Telecom-QA-MultipleChoice
Papers: arxiv:1908.10084, arxiv:1705.00652
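Given the sentence-similarity tags and the presence of config_sentence_transformers.json and modules.json in the file listing, the repository should load through the SentenceTransformer class. A minimal usage sketch, assuming standard sentence-transformers loading; the example sentences are illustrative and not taken from the repository:

```python
# Minimal sketch: load the model and score sentence similarity.
# Assumes the repo loads as a standard sentence-transformers model,
# as suggested by its tags and config_sentence_transformers.json.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("dinho1597/phi-2-telecom-ft")

# Illustrative telecom-style sentences (not from the repository).
sentences = [
    "What is the function of the RRC layer in 5G NR?",
    "The Radio Resource Control layer manages connection setup.",
]
embeddings = model.encode(sentences)

# Cosine similarity between the two embeddings.
score = util.cos_sim(embeddings[0], embeddings[1])
print(score)
```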
Files and versions (branch: main)
Repository size: 11.5 GB, 1 contributor
History: 5 commits
Latest commit: 12e8d79 (verified) by dinho1597, "Upload tokenizer", 12 months ago
| File | Size | Last commit message | Age |
|---|---|---|---|
| 1_Pooling/ | – | Uploading initial model | 12 months ago |
| .gitattributes | 1.52 kB | initial commit | 12 months ago |
| README.md | 22.3 kB | Uploading initial model | 12 months ago |
| added_tokens.json | 1.08 kB | Upload tokenizer | 12 months ago |
| config.json | 735 Bytes | Upload PhiForCausalLM | 12 months ago |
| config_sentence_transformers.json | 205 Bytes | Uploading initial model | 12 months ago |
| generation_config.json | 119 Bytes | Upload PhiForCausalLM | 12 months ago |
| merges.txt | 456 kB | Upload tokenizer | 12 months ago |
| model-00001-of-00003.safetensors | 4.98 GB | Upload PhiForCausalLM | 12 months ago |
| model-00002-of-00003.safetensors | 4.98 GB | Upload PhiForCausalLM | 12 months ago |
| model-00003-of-00003.safetensors | 1.15 GB | Upload PhiForCausalLM | 12 months ago |
| model.safetensors | 133 MB | Uploading initial model | 12 months ago |
| model.safetensors.index.json | 35.7 kB | Upload PhiForCausalLM | 12 months ago |
| modules.json | 349 Bytes | Uploading initial model | 12 months ago |
| optimizer.pt | 266 MB | Uploading initial model | 12 months ago |
| rng_state.pth | 14.2 kB | Uploading initial model | 12 months ago |
| scheduler.pt | 1.06 kB | Uploading initial model | 12 months ago |
| sentence_bert_config.json | 52 Bytes | Uploading initial model | 12 months ago |
| special_tokens_map.json | 473 Bytes | Upload tokenizer | 12 months ago |
| tokenizer.json | 3.56 MB | Upload tokenizer | 12 months ago |
| tokenizer_config.json | 7.44 kB | Upload tokenizer | 12 months ago |
| trainer_state.json | 10.7 kB | Uploading initial model | 12 months ago |
| training_args.bin | 5.82 kB | Uploading initial model | 12 months ago |
| vocab.json | 798 kB | Upload tokenizer | 12 months ago |
| vocab.txt | 232 kB | Uploading initial model | 12 months ago |

(Commit messages originally in Spanish, "Subiendo modelo inicial", are translated above as "Uploading initial model".)
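The listing includes training artifacts (optimizer.pt, rng_state.pth, scheduler.pt, trainer_state.json, training_args.bin) alongside the model weights, which is part of why the repository totals 11.5 GB. A sketch of fetching only the files needed for inference with huggingface_hub; the allow_patterns filter shown is an illustrative choice, not something prescribed by the repository:

```python
# Sketch: download only inference-relevant files, skipping training
# artifacts such as optimizer.pt (266 MB). The allow_patterns filter
# below is an illustrative assumption about what inference needs.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="dinho1597/phi-2-telecom-ft",
    allow_patterns=["*.json", "*.txt", "*.safetensors", "1_Pooling/*"],
)
print(local_dir)  # local path containing the filtered snapshot
```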