SentenceTransformer based on google/embeddinggemma-300m

This is a sentence-transformers model fine-tuned from google/embeddinggemma-300m. It maps sentences and paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: google/embeddinggemma-300m
  • Maximum Sequence Length: 256 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
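
Cosine similarity compares embedding direction and ignores magnitude; a minimal NumPy sketch (the two toy vectors are illustrative, standing in for 768-dimensional embeddings):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # cos(theta) = (a . b) / (||a|| * ||b||)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Identical direction -> 1.0; 45 degrees apart -> ~0.7071
print(cosine_similarity(np.array([1.0, 0.0]), np.array([1.0, 1.0])))  # ~0.7071
```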

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False, 'architecture': 'Gemma3TextModel'})
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
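
The Pooling module above uses mean pooling (pooling_mode_mean_tokens: True): token embeddings are averaged over non-padding positions. A sketch of masked mean pooling under that assumption (illustrative, not the library's internal code):

```python
import torch

def mean_pooling(token_embeddings: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # token_embeddings: (batch, seq_len, 768); attention_mask: (batch, seq_len) with 1 = real token
    mask = attention_mask.unsqueeze(-1).float()        # (batch, seq_len, 1)
    summed = (token_embeddings * mask).sum(dim=1)      # sum embeddings of real tokens only
    counts = mask.sum(dim=1).clamp(min=1e-9)           # number of real tokens per sequence
    return summed / counts                             # (batch, 768)

emb = torch.ones(2, 4, 768)
mask = torch.tensor([[1, 1, 0, 0], [1, 1, 1, 1]])
print(mean_pooling(emb, mask).shape)  # torch.Size([2, 768])
```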

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("ahmedHamdi/narrative-similarity-fr-en-gemma-masked-NE")
# Run inference
sentences = [
    "A young man announces to his closest friends that he is about to marry a young woman from a wealthy family who has rejected him without a second thought. To mark the end of his bachelorhood, they decide to throw him a party he won't soon forget...",
    "ORG animal PERSON (PERSON), who makes his living as a Catholic-school bus driver, decides to settle down and marry his girlfriend PERSON (PERSON). After learning the news of the engagement, PERSON's shocked friends, led by PERSON (PERSON), decide to throw him an epic bachelor party. The bride's wealthy, conservative parents are unhappy with her decision, and her father enlists the help of ORG's ex-boyfriend PERSON (PERSON) to sabotage her relationship with PERSON and win her back. While ORG worries and goes off to a bridal shower thrown by her friends, PERSON heads to ORG, which takes place in a lavish, spacious hotel suite, and promises to remain faithful. Both parties start off on the wrong foot because of Cole's meddling. As the bachelor party starts to heat up, ORG and the girls decide to get even with PERSON and his friends by having a party of their own. Both parties eventually collide, leading to ORG accusing PERSON of infidelity. The bachelor party becomes a wild, drunken orgy and the hotel room is trashed, which infuriates the hotel's frustrated manager (PERSON). Adding to the confusion is PERSON's friend PERSON, who has become despondent over the breakup of his marriage and botches several suicide attempts. PERSON convinces PERSON of his love and faithfulness just as the party is raided by the police. In the ensuing melee, PERSON and ORG become separated and Cole kidnaps ORG, so PERSON and his friends chase after them. The chase culminates in a showdown between PERSON and Cole in a 36-screen movie theater, with a fist fight taking place in synchronization with a similar fight being shown in a 3D film projected behind them; the audience believes that the real fight is an extraordinary 3D effect. PERSON wins the fight and is reunited with ORG. After the wedding, PERSON and PERSON are driven to the airport for their honeymoon in PERSON's school bus, which is driven by a laughing PERSON.",
    'Film historian PERSON provides a synopsis of it, a film no longer known to exist in any archive:',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[ 1.0000,  0.6971,  0.0240],
#         [ 0.6971,  1.0000, -0.2039],
#         [ 0.0240, -0.2039,  1.0000]])
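
For semantic search, the same embeddings can rank a corpus against a query by cosine similarity. A sketch with toy 3-dimensional vectors standing in for the model's 768-dimensional outputs (the vectors are illustrative):

```python
import numpy as np

def rank_by_similarity(query_emb: np.ndarray, corpus_embs: np.ndarray):
    # Normalize rows so that a dot product equals cosine similarity
    q = query_emb / np.linalg.norm(query_emb)
    c = corpus_embs / np.linalg.norm(corpus_embs, axis=1, keepdims=True)
    scores = c @ q
    order = np.argsort(-scores)       # corpus indices, most similar first
    return order, scores[order]

query = np.array([1.0, 0.0, 0.0])
corpus = np.array([[0.9, 0.1, 0.0],   # nearly parallel to the query
                   [0.0, 1.0, 0.0],   # orthogonal
                   [0.7, 0.7, 0.0]])  # in between
order, scores = rank_by_similarity(query, corpus)
print(order.tolist())  # [0, 2, 1]
```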

Training Details

Training Dataset

Unnamed Dataset

  • Size: 17,628 training samples
  • Columns: sentence_0 and sentence_1
  • Approximate statistics based on the first 1000 samples:

                  sentence_0   sentence_1
    type          string       string
    min tokens    7            3
    mean tokens   122.58       226.23
    max tokens    256          256
  • Samples:
    sentence_0: In GPE, a lonely boy befriends his high school security guard who decides to help him get into hip-hop.
    sentence_1: In GPE, a single mother, PERSON, has cooked dinner and sends her daughter, PERSON, out to find her brother, August. PERSON and August are the victims of a shooting and PERSON dies. 18 months later August suffers from PTSD and anxiety attacks due to the incident and spends most of his time in his room. PERSON goes to work every day and is overprotective of August. August pines for the girl of his dreams PERSON and watches from his window as his best friend Laz sell drugs. School principal PERSON, tells her staff that she is going to cut personnel if attendance does not improve. She hires her soon to be ex-husband, PERSON, as the school security guard and tasks him with trying to get August, who has not been to school for months, to return to class. PERSON is unaware of August's issues and scares him when he enters his bedroom after hearing August working on a piece of music. PERSON secretly befriends August and sees the boy as his ticket back to the hip-hop music scene and a way to keep...

    sentence_0: In the early 1960s, British businessman PERSON (PERSON), due to his frequent travels to LOC, is recruited by PERSON, in conjunction with the ORG. He is tasked with going to the GPE to make contact with PERSON (PERSON), a colonel in ORG military intelligence service (GRU). Penkovsky fears that PERSON impulsiveness might lead him to trigger a nuclear conflict with LOC.
    sentence_1: In 1960, PERSON, a high ranking Soviet official and ORG intelligence officer with access to top secret nuclear information, is disillusioned with PERSON's leadership in light of the growing threat of a nuclear war with GPE. He reaches out to the ORG and offers to provide information that could help de-escalate the situation. The ORG and PERSON decide that it would be better not to use an officer and instead have an ordinary businessman act as an intermediary. They approach salesman PERSON to go to GPE under the pretense of exploring commercial opportunities. PERSON establishes seemingly normal business relations with PERSON and the latter makes arrangements with western intelligence agencies to feed them information. He asks that they continue to use PERSON as their regular courier, reasoning that he will be under the Soviets' radar. Initially opposed to the task, PERSON eventually agrees, partly after ORG officer PERSON emphasizes that his efforts could help prevent a nuclear war and ...

    sentence_0: During the Warring States period, when GPE was divided into seven constantly warring regions, the great King of PERSON, through numerous battles and violent invasions, dreamed of reunifying his now-fragmented empire. PERSON, PERSON, and ORG, three of the greatest warriors of the PERSON kingdom, also known as the most active and dangerous conspirators, were said to have been slain by a man about whom little was known: Nameless. To reward him properly, PERSON wished to receive the mysterious hero. While no one was allowed to approach the King within one hundred paces, ORG, upon arriving at the palace, received gold and land and was permitted to stand twenty paces away from him, then ten, for having defeated PERSON and Snowflake. The King then asked him to tell his story. Nameless explains that PERSON and PERSON were lovers, but that ORG had had an affair with PERSON. PERSON hadn't forgiven him, so Nameless knew that to separate PERSON and PERSON, he first had to defeat PERSON. Nameless c...
    sentence_1: In the middle of the Warring States period, ORG, a PERSON prefect, arrives at the PERSON capital city to meet the king, who has survived multiple attempts on his life by the assassins Long Sky, Flying Snow and PERSON. As a result, the king has implemented extreme security measures: no visitors are allowed to approach the king within 100 paces. Nameless claims that he has slain the three assassins, and their weapons are displayed before the king, who allows the former to approach within ten paces and tell him his story. Nameless recounts first killing PERSON at a gaming house, before traveling to meet ORG and PERSON, who have taken refuge at a calligraphy school in the PERSON state, in a city which was under siege by the PERSON army. He tells PERSON that he is there to commission a calligraphy scroll with the character for PERSON (劍), secretly seeking to learn PERSON's skill through his calligraphy. Nameless also learns that PERSON and PERSON, formerly lovers, have grown distant. Once ...
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim",
        "gather_across_devices": false
    }
    
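
With scale 20.0 and cosine similarity, MultipleNegativesRankingLoss scores each (sentence_0, sentence_1) pair against every other in-batch sentence_1 and applies cross-entropy, so each anchor should rank its own positive first. A minimal sketch of that objective (illustrative, not the library implementation):

```python
import torch
import torch.nn.functional as F

def mnr_loss(anchors: torch.Tensor, positives: torch.Tensor, scale: float = 20.0) -> torch.Tensor:
    # anchors, positives: (batch, dim); row i of positives matches row i of anchors
    a = F.normalize(anchors, dim=1)
    p = F.normalize(positives, dim=1)
    scores = a @ p.T * scale                # scaled cosine similarities, (batch, batch)
    labels = torch.arange(len(a))           # the true positive sits on the diagonal
    return F.cross_entropy(scores, labels)  # other rows' positives act as in-batch negatives

torch.manual_seed(0)
anchors = torch.randn(4, 768)
positives = anchors + 0.01 * torch.randn(4, 768)  # near-duplicates of the anchors
print(mnr_loss(anchors, positives).item())        # close to 0: each anchor ranks its own positive first
```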

Training Hyperparameters

Non-Default Hyperparameters

  • per_device_train_batch_size: 4
  • per_device_eval_batch_size: 4
  • multi_dataset_batch_sampler: round_robin

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: no
  • prediction_loss_only: True
  • per_device_train_batch_size: 4
  • per_device_eval_batch_size: 4
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1
  • num_train_epochs: 3
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: None
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • parallelism_config: None
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch_fused
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • project: huggingface
  • trackio_space_id: trackio
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: no
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: True
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: round_robin
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Epoch Step Training Loss
0.1135 500 0.0453
0.2269 1000 0.07
0.3404 1500 0.1489
0.4538 2000 0.1628
0.5673 2500 0.1959
0.6807 3000 0.1272
0.7942 3500 0.1977
0.9076 4000 0.173
1.0211 4500 0.1484
1.1346 5000 0.0902
1.2480 5500 0.11
1.3615 6000 0.1178
1.4749 6500 0.1181
1.5884 7000 0.0747
1.7018 7500 0.0857
1.8153 8000 0.1008
1.9287 8500 0.0765
2.0422 9000 0.0602
2.1557 9500 0.0294
2.2691 10000 0.0431
2.3826 10500 0.0323
2.4960 11000 0.0365
2.6095 11500 0.0421
2.7229 12000 0.0267
2.8364 12500 0.0214
2.9499 13000 0.0268
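
As a consistency check on the log above: 17,628 training samples with per-device batch size 4 give ⌈17,628 / 4⌉ = 4,407 optimizer steps per epoch, so step 13,000 falls at epoch 13,000 / 4,407 ≈ 2.9499, matching the final row:

```python
import math

samples, batch_size = 17_628, 4
steps_per_epoch = math.ceil(samples / batch_size)
print(steps_per_epoch)                     # 4407
print(round(13_000 / steps_per_epoch, 4))  # 2.9499, the last logged epoch
print(round(500 / steps_per_epoch, 4))     # 0.1135, the first logged epoch
```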

Framework Versions

  • Python: 3.9.18
  • Sentence Transformers: 5.1.2
  • Transformers: 4.57.6
  • PyTorch: 2.8.0+cu128
  • Accelerate: 1.10.1
  • Datasets: 4.5.0
  • Tokenizers: 0.22.2

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}