SentenceTransformer based on KiruruP/anime-recommendation-multilingual-mpnet-base-v2

This is a sentence-transformers model finetuned from KiruruP/anime-recommendation-multilingual-mpnet-base-v2. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: KiruruP/anime-recommendation-multilingual-mpnet-base-v2
  • Maximum Sequence Length: 128 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity

Model Sources

  • Documentation: https://www.sbert.net
  • Repository: https://github.com/UKPLab/sentence-transformers
  • Hugging Face: https://huggingface.co/models?library=sentence-transformers

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 128, 'do_lower_case': False, 'architecture': 'XLMRobertaModel'})
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
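The Pooling module above produces the sentence embedding by averaging token embeddings while ignoring padding. Below is a minimal sketch of that mean-pooling step using the underlying transformer directly; the example sentence is a placeholder, and loading the repository with AutoModel assumes the encoder weights are exported at the repository root, as Sentence Transformers does when pushing to the Hub.

import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("KiruruP/anime-recommendation-multilingual-mpnet-base-v2-v3")
encoder = AutoModel.from_pretrained("KiruruP/anime-recommendation-multilingual-mpnet-base-v2-v3")

# Tokenize with the same 128-token limit as the Transformer module above.
batch = tokenizer(["A slice-of-life comedy set in a small seaside town."],
                  padding=True, truncation=True, max_length=128, return_tensors="pt")

with torch.no_grad():
    token_embeddings = encoder(**batch).last_hidden_state   # (batch, seq_len, 768)

# Mean pooling: average the token embeddings, masking out padding positions.
mask = batch["attention_mask"].unsqueeze(-1).float()         # (batch, seq_len, 1)
sentence_embeddings = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)
print(sentence_embeddings.shape)  # torch.Size([1, 768])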

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("KiruruP/anime-recommendation-multilingual-mpnet-base-v2-v3")
# Run inference
sentences = [
    '"Who is blackmailing the members of Class F in a school where their avatars are acting strangely?"',
    "The blockheads of Class F return with more misadventures! Rather than desperately competing against the elite students in Class A for better facilities, they have other problems at hand. While the girls are constantly vying for the boys' attention, Akihisa Yoshii and Yuuji Sakamoto are being blackmailed by a stalker who threatens to reveal their most embarrassing secrets to the whole school. Moreover, everyone's avatar starts to behave strangely. Filled with more nosebleeds and eye-pokes, the boys of Class F must work together to discover the stalker's identity and deal with the misfortunes that come with love among fools.",
    'Amatsuyu "Jouro" Kisaragi is a completely average second-year high school student who has two dates over one weekend\u2060—with the student council president Sakura "Cosmos" Akino on Saturday, then with his childhood friend Aoi "Himawari" Hinata on Sunday. Sadly for Jouro, both girls proclaim their love for his best friend Taiyou "Sun-chan" Ooga, the ace of the baseball team. Accepting each of their requests for advice and guidance, he is now responsible for helping the two girls win the heart of the same guy. Unbeknownst to his friends, Jouro\'s friendly and obtuse image is all but a ruse designed to cast himself as the clueless protagonist of a textbook romantic comedy. A schemer under his cheery facade, he makes the best of this unexpected turn of events with a new plan: get Sun-chan to fall for either Cosmos or Himawari and take the other as his own prize. But Jouro\'s last-ditch effort is threatened by the gloomy, four-eyed Sumireko "Pansy" Sanshokuin, who surprises Jouro with not only her knowledge of his secret personality but also a confession to the true self he hid for all this time. Stuck in this hilariously messy situation, each of the five students must navigate countless lies, traps, and misunderstandings to come out on top.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 0.9794, 0.6574],
#         [0.9794, 1.0000, 0.6667],
#         [0.6574, 0.6667, 1.0000]])
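Because the model was tuned on query–synopsis pairs, a common pattern is to embed a catalogue of synopses once and rank them against a free-text query. A minimal sketch, with placeholder synopses rather than real catalogue entries:

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("KiruruP/anime-recommendation-multilingual-mpnet-base-v2-v3")

# Hypothetical catalogue of synopses to rank.
corpus = [
    "A retired swordsman opens a small tea shop in the mountains.",
    "High-school students compete in a national robotics tournament.",
    "A detective and her talking cat solve mysteries in a coastal town.",
]
query = "Which story is about solving mysteries with an animal companion?"

corpus_embeddings = model.encode(corpus)
query_embedding = model.encode([query])

# Cosine similarity between the query and every synopsis, then pick the best match.
scores = model.similarity(query_embedding, corpus_embeddings)  # shape (1, 3)
best = scores.argmax().item()
print(corpus[best], float(scores[0, best]))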

Training Details

Training Dataset

Unnamed Dataset

  • Size: 3,358 training samples
  • Columns: sentence_0, sentence_1, and label
  • Approximate statistics based on the first 1000 samples:
    • sentence_0 (string): min: 15 tokens, mean: 47.92 tokens, max: 128 tokens
    • sentence_1 (string): min: 51 tokens, mean: 123.01 tokens, max: 128 tokens
    • label (float): min: 1.0, mean: 1.0, max: 1.0
  • Samples:
    • sentence_0: What is the story about the kind-hearted guardian in a small church who has a secret crush on the pastor and protects the villagers with her mysterious powers?
      sentence_1: In a small church atop a hill, Pastor Lawrence looks after Saint Cecilia. Her presence is a beacon of hope for the villagers, as her mysterious powers protect them from the weakness of their minds and outside evils. While she appears dignified in front of the villagers, Cecilia allows her lazy side to show when alone with the pastor. Although Cecilia is the guardian of the people, Lawrence is determined to shield her from harm. Unbeknownst to Lawrence, Cecilia harbors a huge crush on him. She accompanies him on his shopping trips and provides him with divine protection every day. Despite Lawrence remaining oblivious to her true feelings, Cecilia continues to shower him with obvious displays of affection as the bond between them grows.
      label: 1.0
    • sentence_0: What is a query for a natural language search engine to find a story about a cadet with no space experience, inspired by a passionate woman to join a failing space program and change his country's fate?
      sentence_1: Shirotsugh "Shiro" Lhadatt may be a cadet in the Kingdom of Honneamise's Royal Space Force (RSF), but he has never been in space before—in fact, nobody has. The RSF is often regarded as a failure both by the country's citizens and a government more interested in precipitating a war with a neighboring country than scientific achievement. Following the funeral of a fellow cadet, an unmotivated Shiro is walking in the city one night, when he bumps into Riquinni Nonderaiko, a young, pious woman, genuinely enthusiastic about the significance of space exploration. As the two gradually bond, Riquinni's encouragement inspires Shiro to volunteer as a pilot for a prospective rocket ship, potentially becoming Honneamise's first man in space. Shiro and the RSF are soon joined by a team of elderly but eager scientists and engineers, and together, they embark on a mission to mold their nation's space program into a success. However, their efforts soon catch the attention of the government, which see...
      label: 1.0
    • sentence_0: What is the strongest woman on the planet doing to prevent All For One from seizing her quirk and aiding her allies against Tomura Shigaraki and All Might?
      sentence_1: Following an all-out battle with the Paranormal Liberation Front, it is difficult for the people of Japan to continue placing faith in their heroes. To combat the combined power of Tomura Shigaraki and All For One, All Might calls for his ally from the West—the strongest woman on the planet, Star and Stripe. However, All For One decides to intercept Star and her fleet to get his hands on her overpowered quirk before she can enter Japanese airspace. Although Endeavor, Hawks, and Best Jeanist are headed to the rendezvous point, Star makes a gamble in the present to save her comrades.
      label: 1.0
  • Loss: CosineSimilarityLoss with these parameters:
    {
        "loss_fct": "torch.nn.modules.loss.MSELoss"
    }
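CosineSimilarityLoss drives the cosine similarity of each (sentence_0, sentence_1) pair toward its float label using the MSE criterion listed above. A minimal sketch of an equivalent training setup (the single pair shown is a placeholder, not a row from the actual dataset):

from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer, losses

model = SentenceTransformer("KiruruP/anime-recommendation-multilingual-mpnet-base-v2")

# Placeholder pair; the real dataset holds 3,358 (query, synopsis, label) rows.
train_dataset = Dataset.from_dict({
    "sentence_0": ["What is a story about a clumsy witch finishing her training?"],
    "sentence_1": ["A young witch moves to a seaside town to complete her magical training."],
    "label": [1.0],
})

loss = losses.CosineSimilarityLoss(model)  # MSE between cosine similarity and label

trainer = SentenceTransformerTrainer(model=model, train_dataset=train_dataset, loss=loss)
trainer.train()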
    

Training Hyperparameters

Non-Default Hyperparameters

  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • num_train_epochs: 10
  • multi_dataset_batch_sampler: round_robin
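A sketch of how these non-default values could be mapped onto SentenceTransformerTrainingArguments (the output directory is illustrative):

from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import MultiDatasetBatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="output/anime-recommendation-v3",  # illustrative path
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=10,
    multi_dataset_batch_sampler=MultiDatasetBatchSamplers.ROUND_ROBIN,
)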

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: no
  • prediction_loss_only: True
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1
  • num_train_epochs: 10
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: round_robin
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Epoch    Step   Training Loss
2.3810   500    0.0177
4.7619   1000   0.0038
7.1429   1500   0.0015
9.5238   2000   0.0010

Framework Versions

  • Python: 3.11.13
  • Sentence Transformers: 5.1.0
  • Transformers: 4.55.0
  • PyTorch: 2.6.0+cu124
  • Accelerate: 1.10.0
  • Datasets: 4.0.0
  • Tokenizers: 0.21.4
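PyTorch aside (the card used the 2.6.0 cu124 build, installed per the official PyTorch instructions), the versions above can be pinned when installing:

pip install "sentence-transformers==5.1.0" "transformers==4.55.0" "accelerate==1.10.0" "datasets==4.0.0" "tokenizers==0.21.4"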

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}