Based on the paper: Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks (arXiv:1908.10084).
This is a sentence-transformers model finetuned from isy-thl/multilingual-e5-base-learning-outcome-skill-tuned. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Full model architecture:

SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False, 'architecture': 'XLMRobertaModel'})
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
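The Pooling module above uses mean pooling (`pooling_mode_mean_tokens: True`): token embeddings are averaged, with padding positions excluded via the attention mask. A minimal NumPy sketch, using a hypothetical 2-dimensional toy example instead of the real 768-dimensional space:

```python
import numpy as np

def mean_pool(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average token embeddings, ignoring padded positions.

    token_embeddings: (seq_len, dim); attention_mask: (seq_len,) of 0/1.
    """
    mask = attention_mask[:, None].astype(token_embeddings.dtype)
    summed = (token_embeddings * mask).sum(axis=0)
    count = np.clip(mask.sum(), 1e-9, None)  # avoid division by zero
    return summed / count

tokens = np.array([[1.0, 3.0], [3.0, 5.0], [99.0, 99.0]])  # last row is padding
mask = np.array([1, 1, 0])
print(mean_pool(tokens, mask))  # [2. 4.] -- padding row is ignored
```

The final Normalize() module then scales this pooled vector to unit length.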
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
'query: Tenses explained. Future tense (will / going to)Present tense (simple / progressive)Past tense (simple / progressive)Present perfect tense (simple / progressive)Past perfect tense (simple / progressive)',
'passage: languages not further defined.',
'passage: transport services. Transport is the study of operating, navigating and directing ships, train, aircraft and other forms of transportation. Programmes and qualifications with the following main content are classified here: Aircraft operation Air traffic control Air traffic safety Cabin crew training Communication (air, railway, road etc.) programmes Crane and truck driving Driving programmes Flying and navigation Navigation technologies Postal service Railway operations Road motor vehicle operations Ship operation Shipping Transport studies',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 768)
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[ 1.0000, 0.4377, -0.0224],
# [ 0.4377, 1.0000, -0.1168],
# [-0.0224, -0.1168, 1.0000]])
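Because the model ends with a Normalize() module, embeddings are unit-length, so the similarity matrix above is just all pairwise dot products. A small sketch with NumPy, using random vectors as stand-ins for real embeddings:

```python
import numpy as np

def cosine_sim_matrix(embeddings: np.ndarray) -> np.ndarray:
    # Unit-normalize each row; then every pairwise cosine similarity
    # collapses into a single matrix product.
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    unit = embeddings / norms
    return unit @ unit.T

rng = np.random.default_rng(0)
emb = rng.normal(size=(3, 768))  # stand-in for model.encode(...) output
sims = cosine_sim_matrix(emb)
print(sims.shape)     # (3, 3)
print(np.diag(sims))  # all 1.0: each vector is maximally similar to itself
```

This mirrors what `model.similarity` computes with the default cosine similarity function.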
Evaluated with InformationRetrievalEvaluator on the iscedf dataset:

| Metric | Value |
|---|---|
| cosine_accuracy@1 | 0.6824 |
| cosine_accuracy@3 | 0.8471 |
| cosine_accuracy@5 | 0.9059 |
| cosine_accuracy@10 | 0.9529 |
| cosine_precision@1 | 0.6824 |
| cosine_precision@3 | 0.2824 |
| cosine_precision@5 | 0.1812 |
| cosine_precision@10 | 0.0953 |
| cosine_recall@1 | 0.6824 |
| cosine_recall@3 | 0.8471 |
| cosine_recall@5 | 0.9059 |
| cosine_recall@10 | 0.9529 |
| cosine_ndcg@5 | 0.8018 |
| cosine_ndcg@10 | 0.8165 |
| cosine_mrr@10 | 0.7728 |
| cosine_map@100 | 0.7753 |
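For reference, the @k metrics above can be computed from a ranked result list. A minimal sketch with a hypothetical single-relevant-document query (the case where accuracy@k equals recall@k, as in the table):

```python
import math

def recall_at_k(ranked_ids, relevant_ids, k):
    # Fraction of relevant documents retrieved in the top k.
    hits = sum(1 for d in ranked_ids[:k] if d in relevant_ids)
    return hits / len(relevant_ids)

def ndcg_at_k(ranked_ids, relevant_ids, k):
    # Binary-relevance NDCG: discounted gain of hits vs. the ideal ranking.
    dcg = sum(1.0 / math.log2(i + 2)
              for i, d in enumerate(ranked_ids[:k]) if d in relevant_ids)
    idcg = sum(1.0 / math.log2(i + 2)
               for i in range(min(len(relevant_ids), k)))
    return dcg / idcg

ranked = ["d3", "d1", "d7", "d2", "d9"]  # hypothetical retrieval order
relevant = {"d1"}                        # the one relevant document
print(recall_at_k(ranked, relevant, 1))  # 0.0 -- not at rank 1
print(recall_at_k(ranked, relevant, 3))  # 1.0 -- found within top 3
print(ndcg_at_k(ranked, relevant, 5))    # 1/log2(3), about 0.631
```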
Training dataset with anchor and positive columns:

| | anchor | positive |
|---|---|---|
| type | string | string |
| details | | |

Samples:

| anchor | positive |
|---|---|
| query: Digitalisierung am Beispiel Smart Home. In diesem Kurs erfahren Sie, was ein Smart Home ausmacht, welche Wünsche und Ziele ein Benutzer üblicherweise mit dem Smart Home verbindet und was davon mit aktuellen Produkten schon realisierbar ist. Darüber hinaus bekommen Sie Hintergrundwissen über die technischen Herausforderungen und verschiedene Lösungsansätze mit ihren Vor- und Nachteilen. Dazu gehört auch eine kritische Betrachtung von Sicherheitslücken und Datenschutz, sowie Vendor Lock-In. Schließlich können Sie entdecken, was Smart Home mit Digitalisierung zu tun hat, wie Digitalisierung in anderen Branchen schon gewirkt hat und was für Auswirkungen in Branchen zu erwarten sind, wo die Digitalisierung gerade erst anfängt. Hier geht's zum Kursportrait in der OPEN vhb-Themenwelt: https://open.vhb.org/themenwelt/kursportraits/smart-home/ | passage: information and communication technologies not elsewhere classified. Information technology studies not fitting in the detailed fields are classified here: Artificial intelligence |
| query: BHT Kurscontainer 3. Dies ist der Kurscontainer für Gruppe 3. | passage: business and administration not further defined. |
| query: Prompt-Labor Hochschullehre – Anwendungen. Der erfolgreiche Grundlagenkurs Prompt-Labor findet hier seine Fortsetzung mit neuen anwendungsorientierten Einheiten rund um das Thema Generative KI in der Hochschullehre. Freuen Sie sich auf spannende Beiträge und neue Gesichter im Kurs Prompt-Labor Anwendungen! | passage: education science. Education science is the study of the learning process and the theories, methods and techniques of imparting knowledge to others. Programmes and qualifications with the following main content are classified here: Curriculum studies Didactics Educational assessment, testing and measurement Educational evaluation and research Paedagogical sciences |
Loss: MultipleNegativesRankingLoss with these parameters:

{
    "scale": 20.0,
    "similarity_fct": "cos_sim",
    "gather_across_devices": false
}
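MultipleNegativesRankingLoss treats the other positives in a batch as negatives: scaled cosine similarities act as logits, and each anchor must classify its own positive among all passages in the batch. A minimal NumPy sketch of the idea, not the library's actual implementation:

```python
import numpy as np

def mnrl_loss(anchors: np.ndarray, positives: np.ndarray, scale: float = 20.0) -> float:
    # Unit-normalize so dot products are cosine similarities ("cos_sim").
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    scores = scale * (a @ p.T)  # (batch, batch): anchor i vs. every passage
    # Cross-entropy with the diagonal as the target class: anchor i should
    # rank positive i above all other batch entries (the in-batch negatives).
    log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))

# Perfectly matched pairs give a near-zero loss; mismatched pairs a large one.
ident = np.eye(4)
print(mnrl_loss(ident, ident))                      # close to 0
print(mnrl_loss(ident, np.roll(ident, 1, axis=0)))  # large: positives shuffled
```

The `scale` of 20.0 here corresponds to the scale parameter in the configuration above; larger scales sharpen the softmax over the in-batch candidates.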
Non-default hyperparameters:
- per_device_train_batch_size: 16
- gradient_accumulation_steps: 2
- learning_rate: 2e-05
- weight_decay: 0.01
- num_train_epochs: 12
- lr_scheduler_type: cosine
- warmup_ratio: 0.1
- save_only_model: True
- fp16: True

All hyperparameters:
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: no
- prediction_loss_only: True
- per_device_train_batch_size: 16
- per_device_eval_batch_size: 8
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 2
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 2e-05
- weight_decay: 0.01
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 12
- max_steps: -1
- lr_scheduler_type: cosine
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.1
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: True
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: True
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: proportional
- router_mapping: {}
- learning_rate_mapping: {}

Training log:

| Epoch | Step | Training Loss | iscedf_cosine_ndcg@10 |
|---|---|---|---|
| 1 | 10 | - | 0.6893 |
| 3 | 20 | - | 0.7603 |
| 4 | 30 | - | 0.7893 |
| 6 | 40 | - | 0.8048 |
| 8 | 50 | 1.1322 | 0.8110 |
| 9 | 60 | - | 0.8165 |
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
Base model
intfloat/multilingual-e5-base