SentenceTransformer based on thebajajra/RexBERT-base-embed-pf-v0.1

This is a sentence-transformers model finetuned from thebajajra/RexBERT-base-embed-pf-v0.1 on the nomic-embed-supervised-data dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

Model Sources

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 1024, 'do_lower_case': False, 'architecture': 'ModernBertModel'})
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
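The Pooling module above uses mean pooling: the sentence embedding is the average of the token embeddings, with padding positions masked out. A minimal NumPy sketch of that operation (toy shapes, not the actual 768-dimensional model output):

```python
import numpy as np

def mean_pool(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average token embeddings, ignoring padding positions.

    token_embeddings: (seq_len, hidden_dim) encoder outputs
    attention_mask:   (seq_len,) with 1 for real tokens, 0 for padding
    """
    mask = attention_mask[:, None].astype(token_embeddings.dtype)
    return (token_embeddings * mask).sum(axis=0) / mask.sum()

# Toy example: 3 tokens (last one is padding), hidden size 4
emb = np.array([[1.0, 2.0, 3.0, 4.0],
                [3.0, 4.0, 5.0, 6.0],
                [9.0, 9.0, 9.0, 9.0]])  # padding row, ignored by the mask
mask = np.array([1, 1, 0])
print(mean_pool(emb, mask))  # [2. 3. 4. 5.]
```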

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
queries = [
    "None of this proves that Loral wasn\u0027t disloyal or criminally negligent in its dealings with China.",
]
documents = [
    'There is nothing that proves that Loral was negligent in dealing with China. ',
    'Trimipramine is a tricyclic antidepressant. Trimipramine affects chemicals in the brain that may become unbalanced. Trimipramine is used to treat symptoms of depression.Trimipramine may also be used for purposes not listed in this medication guide. You should not take trimipramine if you have recently had a heart attack. Do not use trimipramine if you have used an MAO inhibitor in the past 14 days.A dangerous drug interaction could occur.ou should not use trimipramine if you are allergic to it, or if you have: 1  if you have recently had a heart attack; or. 2  if you are allergic to antidepressants such as amitriptyline, amoxapine, clomipramine, desipramine, doxepin, imipramine, nortriptyline, or protriptyline.',
    'Lorraine Dunn Lorraine Dunn (12 September 1942 – 16 October 2003) was a Panamanian sprinter. She competed in the 4 × 100 metres relay at the 1960 Summer Olympics and the 1964 Summer Olympics. Dunn finished third in the 200 metres and finished fourth in the 80 metres hurdles at the 1963 Pan American Games.',
]
query_embeddings = model.encode_query(queries)
document_embeddings = model.encode_document(documents)
print(query_embeddings.shape, document_embeddings.shape)
# [1, 768] [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(query_embeddings, document_embeddings)
print(similarities)
# tensor([[ 0.9269, -0.0305,  0.0724]])
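For this model, model.similarity computes cosine similarity, so the scores above can be reproduced directly from the embeddings. A small NumPy sketch using toy vectors in place of real model output:

```python
import numpy as np

def cosine_scores(query_embs: np.ndarray, doc_embs: np.ndarray) -> np.ndarray:
    """Cosine similarity between every query row and every document row."""
    q = query_embs / np.linalg.norm(query_embs, axis=1, keepdims=True)
    d = doc_embs / np.linalg.norm(doc_embs, axis=1, keepdims=True)
    return q @ d.T

# Toy 3-dim embeddings standing in for the 768-dim model output
queries = np.array([[1.0, 0.0, 1.0]])
docs = np.array([[1.0, 0.1, 0.9],     # near-duplicate of the query
                 [0.0, 1.0, 0.0],     # unrelated
                 [-1.0, 0.0, -1.0]])  # opposite direction
scores = cosine_scores(queries, docs)
best = int(np.argmax(scores, axis=1)[0])  # index of the best-matching document
```

As in the printed tensor above, the highest-scoring document for each query is selected with an argmax over its row of the similarity matrix.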

Training Details

Training Dataset

nomic-embed-supervised-data

  • Dataset: nomic-embed-supervised-data at 13eef8a
  • Size: 1,611,024 training samples
  • Columns: query, positive, and negative
  • Approximate statistics based on the first 1000 samples:
    • query (string): min 4 tokens, mean 37.16 tokens, max 1024 tokens
    • positive (string): min 5 tokens, mean 99.03 tokens, max 1024 tokens
    • negative (list): min 20 elements, mean 155.14 elements, max 209 elements
  • Samples:
    query positive negative
    Japan's biggest daily newspaper, Asahi Shimbun, published an interview Sunday with Woody Allen, who, asked to sum up the 20 th century in a single word, replied, Disappointing. Woody Allen described the 20th century as "disappointing'' in an interview with the Japaense newspaper, Asahi Shimbun. ['Billy Joel described the 20th century as "outlandish" in a recent interview with Japan's leading newspaper. ', "aside for the kids' education and--BOP--I have so much trouble with that the", 'An old man is jogging.', 'It was my favorite spot.', 'Two women are playing volleyball.', ...]
    Sedimentation is the term for when clumped solids sink to the bottom of the water. Deposition occurs where the water motion slows. ['The law of superposition is best described by: in undisturbed layers of sedimentary rock, the lowest layers contain the older rocks.', "Review: I felt this film - throughout. I waas impressed with Russell Crowe's talent in developing his relationship with Lillie, such a typical Aussie blend of softly softly approach, a bit self depreciating and very persistent. Really loved the cinematography and direction. Pace was just right and the portrayals of nearly all characters was impressive.Gosh, didn't Russell's talent even in 1993 shine! .. and I have yet to see Gladiator. Question: is it a negative review?", 'Question: How many times did Chopin and Liszy perform together in public? (Answer: seven).', 'Fact 1: Birth may be followed by a period of parental care of the offspring. \nFact 2: Human birth is the rarest of all births.', 'acts. Id. at 963. The Lawson court stated that a "critical distinction” of Pylant from the facts before it in Lawson was that in Pylant, there was "not a factu...
    Volleyball involves techniques like jumping. Volleyball Volleyball is a team sport in which two teams of six players are separated by a net . Each team tries to score points by grounding a ball on the other team 's court under organized rules . It has been a part of the official program of the Summer Olympic Games since 1964 . The complete rules are extensive . But simply , play proceeds as follows : a player on one of the teams begins a ` rally ' by serving the ball ( tossing or releasing it and then hitting it with a hand or arm ) , from behind the back boundary line of the court , over the net , and into the receiving team 's court . The receiving team must not let the ball be grounded within their court . The team may touch the ball up to 3 times but individual players may not touch the ball twice consecutively . Typically , the first two touches are used to set up for an attack , an attempt to direct the ball back over the net in such a way that the serving team is unable to prevent it from being grounded in their court . ... ['Volleyball jump serve The Volleyball jump Serve is a type of volleyball serve where the player increases the power and height of their serve by jumping into the hit . The Jump Serve itself was popularized by the brazilian national volleyball team in 1984 on the Olympics .', "Volleyball variations As volleyball is one of the world 's most popular team sports , second only to football ( soccer ) in the number of players , there are numerous variations of the basic rules . The rules have changed around the world since its creation in 1895 , as skills have developed , to make the game more suited for spectators , for learning or other special needs . Some variations have been included as a change in the international rules by Fédération Internationale de Volleyball ( FIVB ) , others have resulted in variations with specific names . 
The more notable variations include :", 'Volleyball (ball) A volleyball is a ball used to play indoor volleyball , beach volleyball , or other less common v...
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim",
        "gather_across_devices": false
    }
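MultipleNegativesRankingLoss treats each query's paired document as the target class and every other in-batch document as a negative, applying softmax cross-entropy over cosine similarities multiplied by the scale factor. A self-contained NumPy sketch of that computation (an illustration, not the library's implementation):

```python
import numpy as np

def mnr_loss(query_embs: np.ndarray, doc_embs: np.ndarray, scale: float = 20.0) -> float:
    """Multiple negatives ranking loss over one batch.

    Row i of doc_embs is the positive for query i; every other row
    serves as an in-batch negative.
    """
    q = query_embs / np.linalg.norm(query_embs, axis=1, keepdims=True)
    d = doc_embs / np.linalg.norm(doc_embs, axis=1, keepdims=True)
    scores = scale * (q @ d.T)  # scale=20.0 matches the config above
    # Numerically stable log-softmax cross-entropy with targets on the diagonal
    scores = scores - scores.max(axis=1, keepdims=True)
    log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))

rng = np.random.default_rng(0)
docs = rng.normal(size=(4, 8))
aligned = mnr_loss(docs, docs)         # positives identical to queries: low loss
shuffled = mnr_loss(docs, docs[::-1])  # positives deliberately mismatched: high loss
```

Because the loss only needs (query, positive) pairs plus whatever else is in the batch, larger batch sizes (here 256 per device) supply more negatives per query.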
    

Evaluation Dataset

nomic-embed-supervised-data

  • Dataset: nomic-embed-supervised-data at 13eef8a
  • Size: 84,795 evaluation samples
  • Columns: query, positive, and negative
  • Approximate statistics based on the first 1000 samples:
    • query (string): min 4 tokens, mean 34.78 tokens, max 1024 tokens
    • positive (string): min 5 tokens, mean 91.63 tokens, max 1024 tokens
    • negative (list): min 15 elements, mean 151.78 elements, max 209 elements
  • Samples:
    query positive negative
    I think people are upset that they're "copying" a mechanic. Which is ridiculous. Every platformer copied from Mario. That doesn't mean they weren't good games. And this says nothing of the end result of the flow of the game. The people who say spiderman is copying another game. Are the same retarded little children who think any game that has drivable vehicles in it, is a gta rip off.

    You can't take them serious.
    ['i unfollowed all of them earlier this year theres too many and all their tweets and instaposts made up damn near half my feed i said id just wait for the music and then iridesence came out and was pretty mid so ', 'There are five, FIVE fucking boxes of doughnuts by the printer in the office this morning and I am overtired and hungry and in the worst mood, and I’m sorry r/1200isplenty, tea is not a goddamn dessert substitute; I keep drinking it and not only am I not satiated, I have to pee every ten minutes, so I’m here guzzling tea and growling ”nothing tastes as good as skinny feels” under my breath and if I sound crazy, it’s because I am. \n\nEdit: down to one box. Just checked the calories for one doughnut—380. Three hundred. And eighty. Calories. For one doughnut. ', 'What a strange thing to call your cock.', "It doesn't. That's a very personal decision. You're not right or wrong on that. I was just replying because you were clear you don't want to raise the child as your own....
    duties of a medical assistant for resume Include clinical skills in a health care resume. Those that apply to a medical assistant job may be: 1 Preparing patients, including taking of medical histories. 2 The ability to take vital signs. 3 Preparing medication and treatments. 4 Basic first aid, CPR and infection control knowledge. Assisting physicians with exams. ['Include clinical skills in a health care resume. Those that apply to a medical assistant job may be: 1 Preparing patients, including taking of medical histories. 2 The ability to take vital signs. 3 Preparing medication and treatments. Basic first aid, CPR and infection control knowledge.', 'Clinical Medical Assistant Resume. The position of a medical assistant is a key position in any hospital, clinic or a health care center. A medical assistant may perform a variety of duties, pertaining to administrative, front office or clinical duties. Here we will discuss the job profile of a clinical medical assistant. Also, there is a clinical medical assistant resume example provided below, for your further reference.', "Community Q&A. A medical assistant is a member of a health care team that undertakes administrative and clinical job duties. A health care resume must emphasize skills as well as education and job experience. Learn how to write a resume for a medical assistant job and yo...
    when did british rule end in south africa History of South Africa British colonies: Cape Colony, Natal Colony, Transvaal Colony, and Orange River Colony. The country became a self-governing nation state within the British Empire, in 1934 following enactment of the Status of the Union Act. The dominion came to an end on 31 May 1961 as the consequence of a 1960 referendum, which legitimised the country becoming a sovereign state named Republic of South Africa. A republican constitution was adopted. From 1948–1994, South African politics were dominated by Afrikaner nationalism. Racial segregation and white minority rule known officially as apartheid, an Afrikaans word meaning "separateness”, came into existence in 1948 (under ["South Africa 1910. The Union was a dominion that included the former territories of the Cape, Transvaal and Natal colonies, as well as the Orange Free State republic. The Natives' Land Act of 1913 severely restricted the ownership of land by blacks; at that stage natives controlled only 7% of the country. The amount of land reserved for indigenous peoples was later marginally increased. In 1931, the union was fully sovereign from the United Kingdom with the passage of the Statute of Westminster, which abolished the last powers of the British Government on the country. In 1934, the South African Party and", 'History of South Africa conducted raids in Free State territories. Both sides adopted scorched-earth tactics, with large swathes of pasturage and cropland being destroyed. Faced with starvation, Moshoeshoe signed a peace treaty on 15 October 1858, though crucial boundary issues remained unresolved. War broke out again in 1865. After an unsuccessful appeal for aid from the British ...
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim",
        "gather_across_devices": false
    }
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 256
  • per_device_eval_batch_size: 128
  • learning_rate: 2e-06
  • num_train_epochs: 20
  • warmup_ratio: 0.1
  • bf16: True
  • dataloader_num_workers: 10
  • dataloader_prefetch_factor: 10
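With lr_scheduler_type: linear and warmup_ratio: 0.1, the learning rate climbs linearly from 0 to the peak of 2e-6 over the first 10% of steps, then decays linearly back to 0. A small sketch of that schedule (mirroring the behavior of the linear scheduler, with a hypothetical total step count):

```python
def linear_warmup_lr(step: int, total_steps: int, peak_lr: float = 2e-6,
                     warmup_ratio: float = 0.1) -> float:
    """Learning rate at a given step under linear warmup then linear decay."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    # Linear decay from peak_lr down to 0 over the remaining steps
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

total = 1000  # hypothetical total step count for illustration
assert linear_warmup_lr(0, total) == 0.0
assert linear_warmup_lr(100, total) == 2e-6  # end of warmup: peak LR
assert linear_warmup_lr(1000, total) == 0.0  # fully decayed
```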

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 256
  • per_device_eval_batch_size: 128
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-06
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 20
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: True
  • dataloader_num_workers: 10
  • dataloader_prefetch_factor: 10
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • parallelism_config: None
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch_fused
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • project: huggingface
  • trackio_space_id: trackio
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: no
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: True
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Epoch Step Training Loss Validation Loss
0.0127 10 1.2782 -
0.0254 20 1.2831 -
0.0382 30 1.2715 -
0.0509 40 1.2557 -
0.0636 50 1.3035 -
0.0763 60 1.2501 -
0.0891 70 1.2792 -
0.1018 80 1.2826 -
0.1145 90 1.2824 -
0.1272 100 1.3051 -
0.1399 110 1.2809 -
0.1527 120 1.2685 -
0.1654 130 1.2746 -
0.1781 140 1.2672 -
0.1908 150 1.2706 -
0.2036 160 1.2905 -
0.2163 170 1.268 -
0.2290 180 1.2641 -
0.2417 190 1.2823 -
0.2545 200 1.2985 -
0.2672 210 1.2695 -
0.2799 220 1.2772 -
0.2926 230 1.281 -
0.3053 240 1.2709 -
0.3181 250 1.2797 -
0.3308 260 1.2659 -
0.3435 270 1.2721 -
0.3562 280 1.2654 -
0.3690 290 1.2797 -
0.3817 300 1.2656 -
0.3944 310 1.2868 -
0.4071 320 1.2663 -
0.4198 330 1.2686 -
0.4326 340 1.2833 -
0.4453 350 1.3006 -
0.4580 360 1.2474 -
0.4707 370 1.2655 -
0.4835 380 1.2561 -
0.4962 390 1.2541 -
0.5 393 - 1.1388
0.5089 400 1.2582 -
0.5216 410 1.2614 -
0.5344 420 1.2473 -
0.5471 430 1.245 -
0.5598 440 1.2457 -
0.5725 450 1.2705 -
0.5852 460 1.2537 -
0.5980 470 1.2481 -
0.6107 480 1.245 -
0.6234 490 1.2499 -
0.6361 500 1.2403 -
0.6489 510 1.2334 -
0.6616 520 1.2484 -
0.6743 530 1.2255 -
0.6870 540 1.2394 -
0.6997 550 1.2209 -
0.7125 560 1.2383 -
0.7252 570 1.2522 -
0.7379 580 1.2261 -
0.7506 590 1.2171 -
0.7634 600 1.2461 -
0.7761 610 1.2123 -
0.7888 620 1.2174 -
0.8015 630 1.2326 -
0.8142 640 1.227 -
0.8270 650 1.2265 -
0.8397 660 1.2305 -
0.8524 670 1.2132 -
0.8651 680 1.2076 -
0.8779 690 1.2094 -
0.8906 700 1.2163 -
0.9033 710 1.1944 -
0.9160 720 1.2035 -
0.9288 730 1.1791 -
0.9415 740 1.1877 -
0.9542 750 1.1858 -
0.9669 760 1.1756 -
0.9796 770 1.1509 -
0.9924 780 1.1491 -
1.0 786 - 1.0399
1.0051 790 1.1459 -
1.0178 800 1.1585 -
1.0305 810 1.158 -
1.0433 820 1.1474 -
1.0560 830 1.1399 -
1.0687 840 1.1487 -
1.0814 850 1.138 -
1.0941 860 1.1133 -
1.1069 870 1.1085 -
1.1196 880 1.1271 -
1.1323 890 1.1232 -
1.1450 900 1.109 -
1.1578 910 1.0807 -
1.1705 920 1.0983 -
1.1832 930 1.0837 -
1.1959 940 1.0649 -
1.2087 950 1.0596 -
1.2214 960 1.0741 -
1.2341 970 1.0736 -
1.2468 980 1.062 -
1.2595 990 1.0579 -
1.2723 1000 1.0194 -
1.2850 1010 1.0345 -
1.2977 1020 1.0193 -
1.3104 1030 1.0243 -
1.3232 1040 0.9892 -
1.3359 1050 0.9933 -
1.3486 1060 0.9802 -
1.3613 1070 0.9755 -
1.3740 1080 0.9636 -
1.3868 1090 0.9487 -
1.3995 1100 0.9552 -
1.4122 1110 0.9436 -
1.4249 1120 0.925 -
1.4377 1130 0.9186 -
1.4504 1140 0.8905 -
1.4631 1150 0.8737 -
1.4758 1160 0.8613 -
1.4885 1170 0.8602 -
1.5 1179 - 0.7025
1.5013 1180 0.8286 -
1.5140 1190 0.8159 -
1.5267 1200 0.8142 -
1.5394 1210 0.7728 -
1.5522 1220 0.7741 -
1.5649 1230 0.755 -
1.5776 1240 0.7797 -
1.5903 1250 0.7565 -
1.6031 1260 0.7162 -
1.6158 1270 0.7182 -
1.6285 1280 0.7005 -
1.6412 1290 0.6993 -
1.6539 1300 0.694 -
1.6667 1310 0.6761 -
1.6794 1320 0.6702 -
1.6921 1330 0.6657 -
1.7048 1340 0.655 -
1.7176 1350 0.6436 -
1.7303 1360 0.615 -
1.7430 1370 0.6324 -
1.7557 1380 0.6312 -
1.7684 1390 0.6078 -
1.7812 1400 0.6058 -
1.7939 1410 0.5995 -
1.8066 1420 0.6053 -
1.8193 1430 0.5753 -
1.8321 1440 0.5769 -
1.8448 1450 0.5719 -
1.8575 1460 0.5822 -
1.8702 1470 0.5666 -
1.8830 1480 0.5544 -
1.8957 1490 0.5284 -
1.9084 1500 0.5393 -
1.9211 1510 0.5213 -
1.9338 1520 0.5298 -
1.9466 1530 0.513 -
1.9593 1540 0.5211 -
1.9720 1550 0.514 -
1.9847 1560 0.5019 -
1.9975 1570 0.4971 -
2.0 1572 - 0.3834
2.0102 1580 0.4904 -
2.0229 1590 0.5015 -
2.0356 1600 0.4777 -
2.0483 1610 0.4928 -
2.0611 1620 0.473 -
2.0738 1630 0.4781 -
2.0865 1640 0.4715 -
2.0992 1650 0.462 -
2.1120 1660 0.4628 -
2.1247 1670 0.4703 -
2.1374 1680 0.4709 -
2.1501 1690 0.4631 -
2.1628 1700 0.4563 -
2.1756 1710 0.4638 -
2.1883 1720 0.4571 -
2.2010 1730 0.4551 -
2.2137 1740 0.4353 -
2.2265 1750 0.4472 -
2.2392 1760 0.4539 -
2.2519 1770 0.4508 -
2.2646 1780 0.4529 -
2.2774 1790 0.4507 -
2.2901 1800 0.4593 -
2.3028 1810 0.4611 -
2.3155 1820 0.4455 -
2.3282 1830 0.4302 -
2.3410 1840 0.4352 -
2.3537 1850 0.4395 -
2.3664 1860 0.4282 -
2.3791 1870 0.4409 -
2.3919 1880 0.4243 -
2.4046 1890 0.4256 -
2.4173 1900 0.4419 -
2.4300 1910 0.4262 -
2.4427 1920 0.4254 -
2.4555 1930 0.4117 -
2.4682 1940 0.4294 -
2.4809 1950 0.4227 -
2.4936 1960 0.4192 -
2.5 1965 - 0.3127
2.5064 1970 0.4212 -
2.5191 1980 0.4288 -
2.5318 1990 0.4076 -
2.5445 2000 0.4158 -
2.5573 2010 0.4186 -
2.5700 2020 0.4213 -
2.5827 2030 0.4144 -
2.5954 2040 0.4343 -
2.6081 2050 0.4193 -
2.6209 2060 0.4149 -
2.6336 2070 0.4181 -
2.6463 2080 0.411 -
2.6590 2090 0.4142 -
2.6718 2100 0.4232 -
2.6845 2110 0.4083 -
2.6972 2120 0.4192 -
2.7099 2130 0.4233 -
2.7226 2140 0.4196 -
2.7354 2150 0.4265 -
2.7481 2160 0.4176 -
2.7608 2170 0.4165 -
2.7735 2180 0.3897 -
2.7863 2190 0.4119 -
2.7990 2200 0.4181 -
2.8117 2210 0.4214 -
2.8244 2220 0.4122 -
2.8372 2230 0.407 -
2.8499 2240 0.4068 -
2.8626 2250 0.4125 -
2.8753 2260 0.4153 -
2.8880 2270 0.4119 -
2.9008 2280 0.4168 -
2.9135 2290 0.4219 -
2.9262 2300 0.4149 -
2.9389 2310 0.4244 -
2.9517 2320 0.4052 -
2.9644 2330 0.4073 -
2.9771 2340 0.4116 -
2.9898 2350 0.4148 -
3.0 2358 - 0.2973
3.0025 2360 0.4002 -
3.0153 2370 0.4056 -
3.0280 2380 0.4086 -
3.0407 2390 0.4131 -
3.0534 2400 0.4066 -
3.0662 2410 0.4062 -
3.0789 2420 0.4116 -
3.0916 2430 0.3935 -
3.1043 2440 0.405 -
3.1170 2450 0.3993 -
3.1298 2460 0.4062 -
3.1425 2470 0.4027 -
3.1552 2480 0.3876 -
3.1679 2490 0.4075 -
3.1807 2500 0.4015 -
3.1934 2510 0.4137 -
3.2061 2520 0.4074 -
3.2188 2530 0.4134 -
3.2316 2540 0.4139 -
3.2443 2550 0.4106 -
3.2570 2560 0.4103 -
3.2697 2570 0.3893 -
3.2824 2580 0.4114 -
3.2952 2590 0.407 -
3.3079 2600 0.4025 -
3.3206 2610 0.3997 -
3.3333 2620 0.4054 -
3.3461 2630 0.4075 -
3.3588 2640 0.3926 -
3.3715 2650 0.3924 -
3.3842 2660 0.4245 -
3.3969 2670 0.4087 -
3.4097 2680 0.402 -
3.4224 2690 0.4094 -
3.4351 2700 0.4044 -
3.4478 2710 0.4131 -
3.4606 2720 0.4098 -
3.4733 2730 0.3998 -
3.4860 2740 0.3948 -
3.4987 2750 0.4009 -
3.5 2751 - 0.2923
3.5115 2760 0.3911 -
3.5242 2770 0.3964 -
3.5369 2780 0.4002 -
3.5496 2790 0.4055 -
3.5623 2800 0.4008 -
3.5751 2810 0.3956 -
3.5878 2820 0.4164 -
3.6005 2830 0.3971 -
3.6132 2840 0.3946 -
3.6260 2850 0.3974 -
3.6387 2860 0.3901 -
3.6514 2870 0.3997 -
3.6641 2880 0.4094 -
3.6768 2890 0.4031 -
3.6896 2900 0.4042 -
3.7023 2910 0.3977 -
3.7150 2920 0.3974 -
3.7277 2930 0.4059 -
3.7405 2940 0.3967 -
3.7532 2950 0.3981 -
3.7659 2960 0.3951 -
3.7786 2970 0.4044 -
3.7913 2980 0.3964 -
3.8041 2990 0.3925 -
3.8168 3000 0.3988 -
3.8295 3010 0.3913 -
3.8422 3020 0.4037 -
3.8550 3030 0.4008 -
3.8677 3040 0.3984 -
3.8804 3050 0.395 -
3.8931 3060 0.4005 -
3.9059 3070 0.3934 -
3.9186 3080 0.3858 -
3.9313 3090 0.3984 -
3.9440 3100 0.392 -
3.9567 3110 0.4011 -
3.9695 3120 0.3946 -
3.9822 3130 0.3883 -
3.9949 3140 0.3939 -
4.0 3144 - 0.2901
4.0076 3150 0.3879 -
4.0204 3160 0.3909 -
4.0331 3170 0.3975 -
4.0458 3180 0.3987 -
4.0585 3190 0.3919 -
4.0712 3200 0.4138 -
4.0840 3210 0.382 -
4.0967 3220 0.4063 -
4.1094 3230 0.3885 -
4.1221 3240 0.4015 -
4.1349 3250 0.4007 -
4.1476 3260 0.4045 -
4.1603 3270 0.4094 -
4.1730 3280 0.386 -
4.1858 3290 0.4054 -
4.1985 3300 0.3927 -
4.2112 3310 0.3996 -
4.2239 3320 0.4042 -
4.2366 3330 0.3954 -
4.2494 3340 0.3827 -
4.2621 3350 0.4022 -
4.2748 3360 0.4044 -
4.2875 3370 0.3988 -
4.3003 3380 0.4062 -
4.3130 3390 0.4005 -
4.3257 3400 0.4067 -
4.3384 3410 0.3825 -
4.3511 3420 0.3878 -
4.3639 3430 0.4012 -
4.3766 3440 0.3966 -
4.3893 3450 0.3871 -
4.4020 3460 0.3998 -
4.4148 3470 0.4033 -
4.4275 3480 0.4069 -
4.4402 3490 0.3994 -
4.4529 3500 0.3969 -
4.4656 3510 0.398 -
4.4784 3520 0.3945 -
4.4911 3530 0.3884 -
4.5 3537 - 0.2890
4.5038 3540 0.4028 -
4.5165 3550 0.3852 -
4.5293 3560 0.3986 -
4.5420 3570 0.3942 -
4.5547 3580 0.4043 -
4.5674 3590 0.4064 -
4.5802 3600 0.3904 -
4.5929 3610 0.3934 -
4.6056 3620 0.3861 -
4.6183 3630 0.3927 -
4.6310 3640 0.4085 -
4.6438 3650 0.3951 -
4.6565 3660 0.408 -
4.6692 3670 0.3902 -
4.6819 3680 0.3966 -
4.6947 3690 0.4102 -
4.7074 3700 0.3961 -
4.7201 3710 0.4022 -
4.7328 3720 0.4056 -
4.7455 3730 0.4078 -
4.7583 3740 0.3974 -
4.7710 3750 0.3961 -
4.7837 3760 0.3861 -
4.7964 3770 0.3933 -
4.8092 3780 0.3902 -
4.8219 3790 0.3938 -
4.8346 3800 0.3913 -
4.8473 3810 0.3985 -
4.8601 3820 0.4055 -
4.8728 3830 0.3933 -
4.8855 3840 0.3968 -
4.8982 3850 0.3959 -
4.9109 3860 0.3933 -
4.9237 3870 0.4057 -
4.9364 3880 0.4023 -
4.9491 3890 0.3938 -
4.9618 3900 0.3969 -
4.9746 3910 0.4019 -
4.9873 3920 0.3922 -
5.0 3930 0.3955 0.2884
5.0127 3940 0.3968 -
5.0254 3950 0.3899 -
5.0382 3960 0.3861 -
5.0509 3970 0.3866 -
5.0636 3980 0.4033 -
5.0763 3990 0.4031 -
5.0891 4000 0.4154 -
5.1018 4010 0.4002 -
5.1145 4020 0.4006 -
5.1272 4030 0.3959 -
5.1399 4040 0.4043 -
5.1527 4050 0.3904 -
5.1654 4060 0.3939 -
5.1781 4070 0.3824 -
5.1908 4080 0.3945 -
5.2036 4090 0.3897 -
5.2163 4100 0.3983 -
5.2290 4110 0.3925 -
5.2417 4120 0.4056 -
5.2545 4130 0.4121 -
5.2672 4140 0.4013 -
5.2799 4150 0.4016 -
5.2926 4160 0.3944 -
5.3053 4170 0.393 -
5.3181 4180 0.3999 -
5.3308 4190 0.3833 -
5.3435 4200 0.3719 -
5.3562 4210 0.3934 -
5.3690 4220 0.4009 -
5.3817 4230 0.3843 -
5.3944 4240 0.4003 -
5.4071 4250 0.3971 -
5.4198 4260 0.399 -
5.4326 4270 0.4037 -
5.4453 4280 0.3965 -
5.4580 4290 0.3849 -
5.4707 4300 0.4083 -
5.4835 4310 0.4102 -
5.4962 4320 0.3944 -
5.5 4323 - 0.2881
5.5089 4330 0.4029 -
5.5216 4340 0.3967 -
5.5344 4350 0.3874 -
5.5471 4360 0.3922 -
5.5598 4370 0.3903 -
5.5725 4380 0.402 -
5.5852 4390 0.4023 -
5.5980 4400 0.4039 -
5.6107 4410 0.3954 -
5.6234 4420 0.4044 -
5.6361 4430 0.4067 -
5.6489 4440 0.391 -
5.6616 4450 0.3968 -
5.6743 4460 0.4016 -
5.6870 4470 0.4007 -
5.6997 4480 0.3893 -
5.7125 4490 0.3978 -
5.7252 4500 0.3981 -
5.7379 4510 0.3908 -
5.7506 4520 0.3952 -
5.7634 4530 0.4013 -
5.7761 4540 0.3892 -
5.7888 4550 0.4012 -
5.8015 4560 0.3881 -
5.8142 4570 0.3799 -
5.8270 4580 0.3933 -
5.8397 4590 0.3787 -
5.8524 4600 0.3886 -
5.8651 4610 0.4028 -
5.8779 4620 0.4022 -
5.8906 4630 0.4038 -
5.9033 4640 0.3886 -
5.9160 4650 0.3981 -
5.9288 4660 0.3884 -
5.9415 4670 0.3992 -
5.9542 4680 0.4065 -
5.9669 4690 0.3918 -
5.9796 4700 0.4078 -
5.9924 4710 0.3928 -
6.0 4716 - 0.2879
6.0051 4720 0.3939 -
6.0178 4730 0.4158 -
6.0305 4740 0.4007 -
6.0433 4750 0.3929 -
6.0560 4760 0.3977 -
6.0687 4770 0.3993 -
6.0814 4780 0.3776 -
6.0941 4790 0.391 -
6.1069 4800 0.3894 -
6.1196 4810 0.397 -
6.1323 4820 0.4013 -
6.1450 4830 0.3871 -
6.1578 4840 0.3901 -
6.1705 4850 0.3958 -
6.1832 4860 0.3978 -
6.1959 4870 0.385 -
6.2087 4880 0.3959 -
6.2214 4890 0.3897 -
6.2341 4900 0.399 -
6.2468 4910 0.3987 -
6.2595 4920 0.3971 -
6.2723 4930 0.3914 -
6.2850 4940 0.4009 -
6.2977 4950 0.3919 -
6.3104 4960 0.3933 -
6.3232 4970 0.3919 -
6.3359 4980 0.3761 -
6.3486 4990 0.3865 -
6.3613 5000 0.3816 -
6.3740 5010 0.3898 -
6.3868 5020 0.3919 -
6.3995 5030 0.3908 -
6.4122 5040 0.4045 -
6.4249 5050 0.4044 -
6.4377 5060 0.3916 -
6.4504 5070 0.3954 -
6.4631 5080 0.4123 -
6.4758 5090 0.4027 -
6.4885 5100 0.403 -
6.5 5109 - 0.2877
6.5013 5110 0.4009 -
6.5140 5120 0.3901 -
6.5267 5130 0.4065 -
6.5394 5140 0.4033 -
6.5522 5150 0.3908 -
6.5649 5160 0.3996 -
6.5776 5170 0.4002 -
6.5903 5180 0.3884 -
6.6031 5190 0.399 -
6.6158 5200 0.3921 -
6.6285 5210 0.3913 -
6.6412 5220 0.3926 -
6.6539 5230 0.3929 -
6.6667 5240 0.3965 -
6.6794 5250 0.3944 -
6.6921 5260 0.4034 -
6.7048 5270 0.3867 -
6.7176 5280 0.4031 -
6.7303 5290 0.3926 -
6.7430 5300 0.3923 -
6.7557 5310 0.3977 -
6.7684 5320 0.3987 -
6.7812 5330 0.4027 -
6.7939 5340 0.3995 -
6.8066 5350 0.405 -
6.8193 5360 0.4113 -
6.8321 5370 0.3974 -
6.8448 5380 0.4014 -
6.8575 5390 0.3972 -
6.8702 5400 0.3953 -
6.8830 5410 0.3992 -
6.8957 5420 0.393 -
6.9084 5430 0.3865 -
6.9211 5440 0.3892 -
6.9338 5450 0.3955 -
6.9466 5460 0.3954 -
6.9593 5470 0.4041 -
6.9720 5480 0.3927 -
6.9847 5490 0.416 -
6.9975 5500 0.3895 -
7.0 5502 - 0.2877
7.0102 5510 0.3927 -
7.0229 5520 0.3896 -
7.0356 5530 0.3926 -
7.0483 5540 0.3907 -
7.0611 5550 0.3916 -
7.0738 5560 0.399 -
7.0865 5570 0.3882 -
7.0992 5580 0.4003 -
7.1120 5590 0.3928 -
7.1247 5600 0.4001 -
7.1374 5610 0.4098 -
7.1501 5620 0.3948 -
7.1628 5630 0.3988 -
7.1756 5640 0.3893 -
7.1883 5650 0.3905 -
7.2010 5660 0.4024 -
7.2137 5670 0.4011 -
7.2265 5680 0.3959 -
7.2392 5690 0.4014 -
7.2519 5700 0.3932 -
7.2646 5710 0.385 -
7.2774 5720 0.3964 -
7.2901 5730 0.3922 -
7.3028 5740 0.3864 -
7.3155 5750 0.3864 -
7.3282 5760 0.4018 -
7.3410 5770 0.406 -
7.3537 5780 0.3975 -
7.3664 5790 0.405 -
7.3791 5800 0.3872 -
7.3919 5810 0.3885 -
7.4046 5820 0.3942 -
7.4173 5830 0.3913 -
7.4300 5840 0.41 -
7.4427 5850 0.3957 -
7.4555 5860 0.4052 -
7.4682 5870 0.4082 -
7.4809 5880 0.3889 -
7.4936 5890 0.4008 -
7.5 5895 - 0.2876
7.5064 5900 0.4027 -
7.5191 5910 0.4029 -
7.5318 5920 0.398 -
7.5445 5930 0.3914 -
7.5573 5940 0.3943 -
7.5700 5950 0.3983 -
7.5827 5960 0.3955 -
7.5954 5970 0.3984 -
7.6081 5980 0.3987 -
7.6209 5990 0.391 -
7.6336 6000 0.4049 -
7.6463 6010 0.3837 -
7.6590 6020 0.3899 -
7.6718 6030 0.3945 -
7.6845 6040 0.398 -
7.6972 6050 0.3902 -
7.7099 6060 0.3926 -
7.7226 6070 0.4101 -
7.7354 6080 0.3832 -
7.7481 6090 0.3985 -
7.7608 6100 0.3847 -
7.7735 6110 0.3905 -
7.7863 6120 0.3756 -
7.7990 6130 0.3921 -
7.8117 6140 0.3906 -
7.8244 6150 0.397 -
7.8372 6160 0.3971 -
7.8499 6170 0.4037 -
7.8626 6180 0.3873 -
7.8753 6190 0.3871 -
7.8880 6200 0.3917 -
7.9008 6210 0.3843 -
7.9135 6220 0.4039 -
7.9262 6230 0.4004 -
7.9389 6240 0.4038 -
7.9517 6250 0.3908 -
7.9644 6260 0.4022 -
7.9771 6270 0.4061 -
7.9898 6280 0.3799 -
8.0 6288 - 0.2875

Framework Versions

  • Python: 3.11.13
  • Sentence Transformers: 5.1.2
  • Transformers: 4.57.1
  • PyTorch: 2.8.0+cu129
  • Accelerate: 1.11.0
  • Datasets: 4.3.0
  • Tokenizers: 0.22.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}