---
license: apache-2.0
pipeline_tag: text-ranking
library_name: lightning-ir
base_model:
- google/electra-large-discriminator
tags:
- cross-encoder
---

# Set-Encoder: Permutation-Invariant Inter-Passage Attention for Listwise Passage Re-Ranking with Cross-Encoders

This model is presented in the paper [Set-Encoder: Permutation-Invariant Inter-Passage Attention for Listwise Passage Re-Ranking with Cross-Encoders](https://huggingface.co/papers/2404.06912). The Set-Encoder is a cross-encoder architecture designed for efficient, permutation-invariant listwise passage re-ranking.

Code: https://github.com/webis-de/set-encoder
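
For quick experimentation outside the full evaluation pipeline, the checkpoints can also be loaded through Lightning IR's Python API. The snippet below is a minimal sketch, assuming Lightning IR's `CrossEncoderModule` interface and the `webis/set-encoder-base` checkpoint listed in the table below; the example query and passages are illustrative only.

```python
from lightning_ir import CrossEncoderModule

# Load a pre-trained Set-Encoder checkpoint (downloaded from the Hugging Face Hub).
module = CrossEncoderModule("webis/set-encoder-base")

# Score a list of passages against a query. The Set-Encoder attends across the
# whole passage list, so the scores are listwise yet permutation-invariant.
output = module.score(
    "What is the capital of France?",
    [
        "Paris is the capital and largest city of France.",
        "Berlin is the capital of Germany.",
    ],
)
print(output.scores)  # one relevance score per passage
```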

We provide the following pre-trained models for general-purpose re-ranking.

To reproduce the results below, run the following command using the [Lightning IR](https://github.com/webis-de/lightning-ir) library and the configuration files from the repository linked above:

```bash
lightning-ir re_rank --config ./configs/re-rank.yaml --model.model_name_or_path <MODEL_NAME>
```

Results (nDCG@10) on TREC DL 19 and TREC DL 20, re-ranking first-stage results from BM25 and ColBERTv2:

| Model Name                                                                               | TREC DL 19 (BM25) | TREC DL 20 (BM25) | TREC DL 19 (ColBERTv2) | TREC DL 20 (ColBERTv2) |
| ---------------------------------------------------------------------------------------- | ----------------- | ----------------- | ---------------------- | ---------------------- |
| [webis/set-encoder-base](https://huggingface.co/webis/set-encoder-base)                  | 0.746             | 0.704             | 0.781                  | 0.768                  |
| [webis/set-encoder-large](https://huggingface.co/webis/set-encoder-large)                | 0.750             | 0.722             | 0.789                  | 0.791                  |


## Citation

If you use this code or the models in your research, please cite our paper:

```bibtex
@InProceedings{schlatt:2025,
  address =                  {Berlin Heidelberg New York},
  author =                   {Ferdinand Schlatt and Maik Fr{\"o}be and Harrisen Scells and Shengyao Zhuang and Bevan Koopman and Guido Zuccon and Benno Stein and Martin Potthast and Matthias Hagen},
  booktitle =                {Advances in Information Retrieval. 47th European Conference on IR Research (ECIR 2025)},
  doi =                      {10.1007/978-3-031-88711-6_1},
  month =                    apr,
  publisher =                {Springer},
  series =                   {Lecture Notes in Computer Science},
  site =                     {Lucca, Italy},
  title =                    {{Set-Encoder: Permutation-Invariant Inter-Passage Attention for Listwise Passage Re-Ranking with Cross-Encoders}},
  year =                     2025
}
```