ms-marco-MiniLM-L6-v2 reranker trained on GooAQ
This is a Cross Encoder model finetuned from cross-encoder/ms-marco-MiniLM-L6-v2 using the sentence-transformers library. It computes scores for pairs of texts, which can be used for text reranking and semantic search.
Model Details
Model Description
- Model Type: Cross Encoder
- Base model: cross-encoder/ms-marco-MiniLM-L6-v2
- Maximum Sequence Length: 512 tokens
- Number of Output Labels: 1 label
- Language: en
- License: apache-2.0
Model Sources
- Documentation: Sentence Transformers Documentation
- Documentation: Cross Encoder Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Cross Encoders on Hugging Face
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import CrossEncoder
# Download from the 🤗 Hub
model = CrossEncoder("ayushexel/reranker-ms-marco-MiniLM-L6-v2-gooaq-bce-500k")
# Get scores for pairs of texts
pairs = [
['what are the 50 state mottos?', '[\'Maine. "Dirigo" (\\u200bI direct)\', \'44. California. "\\u200bEureka" (I have found it) ... \', \'Arizona. "Ditat Deus" (\\u200bGod Enriches) ... \', \'Indiana. "The Crossroads of America" ... \', \'Alaska. "North to the Future" ... \', \'Utah. "Industry" ... \', \'Delaware. "Liberty and Independence" ... \', \'Maryland. "Fatti maschii, parole femine" (Manly deeds womanly words) ... \']'],
['what does it mean when you have white pee?', 'A milky quality to your urine is typically caused by your body sending an increase in white blood cells to fight an infection. When these white blood exit your body via your urine, the cells mix, and your urine appears cloudy.'],
['what does it mean when you have white pee?', 'White balance (WB) is the process of removing unrealistic color casts, so that objects which appear white in person are rendered white in your photo. Proper camera white balance has to take into account the "color temperature" of a light source, which refers to the relative warmth or coolness of white light.'],
['what does it mean when you have white pee?', "['Lower abdominal pain.', 'Pain during urination.', 'Frequent urination.', 'Difficulty urinating or interrupted urine flow.', 'Blood in the urine.', 'Cloudy or abnormally dark-colored urine.']"],
['what does it mean when you have white pee?', 'Peeps. ... Peeps are marshmallows sold in the United States and Canada that are shaped into chicks, bunnies, and other animals.'],
]
scores = model.predict(pairs)
print(scores.shape)
# (5,)
# Or rank different texts based on similarity to a single text
ranks = model.rank(
'what are the 50 state mottos?',
[
'[\'Maine. "Dirigo" (\\u200bI direct)\', \'44. California. "\\u200bEureka" (I have found it) ... \', \'Arizona. "Ditat Deus" (\\u200bGod Enriches) ... \', \'Indiana. "The Crossroads of America" ... \', \'Alaska. "North to the Future" ... \', \'Utah. "Industry" ... \', \'Delaware. "Liberty and Independence" ... \', \'Maryland. "Fatti maschii, parole femine" (Manly deeds womanly words) ... \']',
'A milky quality to your urine is typically caused by your body sending an increase in white blood cells to fight an infection. When these white blood exit your body via your urine, the cells mix, and your urine appears cloudy.',
'White balance (WB) is the process of removing unrealistic color casts, so that objects which appear white in person are rendered white in your photo. Proper camera white balance has to take into account the "color temperature" of a light source, which refers to the relative warmth or coolness of white light.',
"['Lower abdominal pain.', 'Pain during urination.', 'Frequent urination.', 'Difficulty urinating or interrupted urine flow.', 'Blood in the urine.', 'Cloudy or abnormally dark-colored urine.']",
'Peeps. ... Peeps are marshmallows sold in the United States and Canada that are shaped into chicks, bunnies, and other animals.',
]
)
# [{'corpus_id': ..., 'score': ...}, {'corpus_id': ..., 'score': ...}, ...]
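For semantic search over a larger corpus, a cross encoder is usually paired with a bi-encoder: the bi-encoder retrieves a shortlist of candidates cheaply, and this reranker rescores them. Below is a minimal sketch of that retrieve-and-rerank pattern; the retrieval model (sentence-transformers/all-MiniLM-L6-v2) and the tiny in-memory corpus are illustrative choices, not part of this model card.
from sentence_transformers import SentenceTransformer, CrossEncoder, util

# Illustrative in-memory corpus; in practice this is your document collection.
corpus = [
    "A milky quality to your urine is typically caused by white blood cells fighting an infection.",
    "White balance (WB) is the process of removing unrealistic color casts in photography.",
    "Peeps are marshmallows sold in the United States and Canada.",
]
query = "what does it mean when you have white pee?"

# Stage 1: retrieve candidates cheaply with a bi-encoder (assumed model choice).
retriever = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
corpus_embeddings = retriever.encode(corpus, convert_to_tensor=True)
query_embedding = retriever.encode(query, convert_to_tensor=True)
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=3)[0]
candidates = [corpus[hit["corpus_id"]] for hit in hits]

# Stage 2: rerank the shortlist with this cross encoder.
reranker = CrossEncoder("ayushexel/reranker-ms-marco-MiniLM-L6-v2-gooaq-bce-500k")
for entry in reranker.rank(query, candidates):
    print(f"{entry['score']:.3f}  {candidates[entry['corpus_id']]}")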
Evaluation
Metrics
Cross Encoder Reranking
- Dataset: gooaq-dev
- Evaluated with CrossEncoderRerankingEvaluator with these parameters: { "at_k": 10, "always_rerank_positives": false }
Metric | Value |
---|---|
map | 0.5832 (+0.2028) |
mrr@10 | 0.5818 (+0.2114) |
ndcg@10 | 0.6298 (+0.1971) |
Cross Encoder Reranking
- Datasets: NanoMSMARCO_R100, NanoNFCorpus_R100 and NanoNQ_R100
- Evaluated with CrossEncoderRerankingEvaluator with these parameters: { "at_k": 10, "always_rerank_positives": true }
Metric | NanoMSMARCO_R100 | NanoNFCorpus_R100 | NanoNQ_R100 |
---|---|---|---|
map | 0.4501 (-0.0395) | 0.3711 (+0.1101) | 0.3764 (-0.0432) |
mrr@10 | 0.4371 (-0.0404) | 0.5112 (+0.0114) | 0.3779 (-0.0488) |
ndcg@10 | 0.5122 (-0.0282) | 0.3773 (+0.0523) | 0.4386 (-0.0621) |
Cross Encoder Nano BEIR
- Dataset: NanoBEIR_R100_mean
- Evaluated with CrossEncoderNanoBEIREvaluator with these parameters: { "dataset_names": ["msmarco", "nfcorpus", "nq"], "rerank_k": 100, "at_k": 10, "always_rerank_positives": true }
Metric | Value |
---|---|
map | 0.3992 (+0.0091) |
mrr@10 | 0.4421 (-0.0260) |
ndcg@10 | 0.4427 (-0.0127) |
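The reranking numbers above come from the evaluators named in each subsection. A minimal sketch of re-running the NanoBEIR evaluation with the same parameters, assuming the sentence-transformers 4.x import path for the evaluator:
from sentence_transformers import CrossEncoder
from sentence_transformers.cross_encoder.evaluation import CrossEncoderNanoBEIREvaluator

model = CrossEncoder("ayushexel/reranker-ms-marco-MiniLM-L6-v2-gooaq-bce-500k")

# Same parameters as reported above: rerank the top 100 retrieved documents,
# compute metrics at k=10, and always include the known positives when reranking.
evaluator = CrossEncoderNanoBEIREvaluator(
    dataset_names=["msmarco", "nfcorpus", "nq"],
    rerank_k=100,
    at_k=10,
    always_rerank_positives=True,
)
results = evaluator(model)
print(results)  # per-dataset MAP / MRR@10 / NDCG@10 plus their mean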
Training Details
Training Dataset
Unnamed Dataset
- Size: 3,648,749 training samples
- Columns: question, answer, and label
- Approximate statistics based on the first 1000 samples:

 | question | answer | label |
---|---|---|---|
type | string | string | int |
details | min: 18 characters, mean: 43.54 characters, max: 83 characters | min: 53 characters, mean: 248.29 characters, max: 400 characters | 0: ~86.10%, 1: ~13.90% |
- Samples:

question | answer | label |
---|---|---|
what are the 50 state mottos? | ['Maine. "Dirigo" (\u200bI direct)', '44. California. "\u200bEureka" (I have found it) ... ', 'Arizona. "Ditat Deus" (\u200bGod Enriches) ... ', 'Indiana. "The Crossroads of America" ... ', 'Alaska. "North to the Future" ... ', 'Utah. "Industry" ... ', 'Delaware. "Liberty and Independence" ... ', 'Maryland. "Fatti maschii, parole femine" (Manly deeds womanly words) ... '] | 1 |
what does it mean when you have white pee? | A milky quality to your urine is typically caused by your body sending an increase in white blood cells to fight an infection. When these white blood exit your body via your urine, the cells mix, and your urine appears cloudy. | 1 |
what does it mean when you have white pee? | White balance (WB) is the process of removing unrealistic color casts, so that objects which appear white in person are rendered white in your photo. Proper camera white balance has to take into account the "color temperature" of a light source, which refers to the relative warmth or coolness of white light. | 0 |
- Loss: BinaryCrossEntropyLoss with these parameters: { "activation_fn": "torch.nn.modules.linear.Identity", "pos_weight": 7 }
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 2048
- per_device_eval_batch_size: 2048
- learning_rate: 2e-05
- warmup_ratio: 0.1
- seed: 12
- bf16: True
- dataloader_num_workers: 12
- load_best_model_at_end: True
All Hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 2048
- per_device_eval_batch_size: 2048
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 2e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 3
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.1
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 12
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: True
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 12
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: True
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- tp_size: 0
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: proportional
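Taken together, the non-default values above describe a fairly standard CrossEncoderTrainer run. The sketch below shows how those values could be wired up, assuming sentence-transformers 4.x and a Hugging Face Dataset with the question/answer/label columns described in the Training Dataset section; the placeholder rows and output directory are illustrative, not the original training script.
import torch
from datasets import Dataset
from sentence_transformers import CrossEncoder
from sentence_transformers.cross_encoder import CrossEncoderTrainer, CrossEncoderTrainingArguments
from sentence_transformers.cross_encoder.losses import BinaryCrossEntropyLoss

# Placeholder rows with the same columns as the real ~3.6M-pair training set.
train_dataset = Dataset.from_dict({
    "question": ["what are the 50 state mottos?", "what does it mean when you have white pee?"],
    "answer": ['Maine. "Dirigo" (I direct) ...', "White balance (WB) is a photography term ..."],
    "label": [1, 0],
})
eval_dataset = train_dataset  # placeholder; the card reports evaluation on gooaq-dev instead

model = CrossEncoder("cross-encoder/ms-marco-MiniLM-L6-v2", num_labels=1)
loss = BinaryCrossEntropyLoss(model, pos_weight=torch.tensor(7))

# Mirrors the non-default hyperparameters listed above; everything else keeps its default.
args = CrossEncoderTrainingArguments(
    output_dir="reranker-ms-marco-MiniLM-L6-v2-gooaq-bce",
    num_train_epochs=3,
    per_device_train_batch_size=2048,
    per_device_eval_batch_size=2048,
    learning_rate=2e-5,
    warmup_ratio=0.1,
    seed=12,
    bf16=True,  # assumes bf16-capable hardware
    dataloader_num_workers=12,
    eval_strategy="steps",
    load_best_model_at_end=True,
)

trainer = CrossEncoderTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=loss,
)
trainer.train()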
Training Logs
Epoch | Step | Training Loss | gooaq-dev_ndcg@10 | NanoMSMARCO_R100_ndcg@10 | NanoNFCorpus_R100_ndcg@10 | NanoNQ_R100_ndcg@10 | NanoBEIR_R100_mean_ndcg@10 |
---|---|---|---|---|---|---|---|
-1 | -1 | - | 0.5865 (+0.1538) | 0.6686 (+0.1282) | 0.3930 (+0.0680) | 0.7599 (+0.2592) | 0.6072 (+0.1518) |
0.0006 | 1 | 2.0734 | - | - | - | - | - |
0.1122 | 200 | 1.4177 | - | - | - | - | - |
0.2245 | 400 | 0.7296 | - | - | - | - | - |
0.3367 | 600 | 0.6662 | - | - | - | - | - |
0.4489 | 800 | 0.6445 | - | - | - | - | - |
0.5612 | 1000 | 0.621 | 0.6176 (+0.1849) | 0.5862 (+0.0458) | 0.4371 (+0.1121) | 0.4987 (-0.0020) | 0.5073 (+0.0520) |
0.6734 | 1200 | 0.6122 | - | - | - | - | - |
0.7856 | 1400 | 0.6031 | - | - | - | - | - |
0.8979 | 1600 | 0.5944 | - | - | - | - | - |
1.0101 | 1800 | 0.5846 | - | - | - | - | - |
1.1223 | 2000 | 0.5647 | 0.6222 (+0.1895) | 0.5471 (+0.0066) | 0.4028 (+0.0778) | 0.4703 (-0.0304) | 0.4734 (+0.0180) |
1.2346 | 2200 | 0.5636 | - | - | - | - | - |
1.3468 | 2400 | 0.5587 | - | - | - | - | - |
1.4590 | 2600 | 0.5543 | - | - | - | - | - |
1.5713 | 2800 | 0.5559 | - | - | - | - | - |
1.6835 | 3000 | 0.5496 | 0.6242 (+0.1915) | 0.4842 (-0.0563) | 0.3852 (+0.0601) | 0.4132 (-0.0874) | 0.4275 (-0.0279) |
1.7957 | 3200 | 0.5426 | - | - | - | - | - |
1.9080 | 3400 | 0.5422 | - | - | - | - | - |
2.0202 | 3600 | 0.5426 | - | - | - | - | - |
2.1324 | 3800 | 0.5311 | - | - | - | - | - |
2.2447 | 4000 | 0.5267 | 0.6291 (+0.1963) | 0.5247 (-0.0158) | 0.3832 (+0.0581) | 0.4469 (-0.0538) | 0.4516 (-0.0038) |
2.3569 | 4200 | 0.526 | - | - | - | - | - |
2.4691 | 4400 | 0.5255 | - | - | - | - | - |
2.5814 | 4600 | 0.5229 | - | - | - | - | - |
2.6936 | 4800 | 0.5206 | - | - | - | - | - |
**2.8058** | **5000** | **0.5196** | **0.6298 (+0.1971)** | **0.5122 (-0.0282)** | **0.3773 (+0.0523)** | **0.4386 (-0.0621)** | **0.4427 (-0.0127)** |
2.9181 | 5200 | 0.5261 | - | - | - | - | - |
-1 | -1 | - | 0.6298 (+0.1971) | 0.5122 (-0.0282) | 0.3773 (+0.0523) | 0.4386 (-0.0621) | 0.4427 (-0.0127) |
- The bold row denotes the saved checkpoint.
Framework Versions
- Python: 3.11.0
- Sentence Transformers: 4.0.1
- Transformers: 4.50.3
- PyTorch: 2.6.0+cu124
- Accelerate: 1.5.2
- Datasets: 3.5.0
- Tokenizers: 0.21.1
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}