fschlatt committed
Commit 75749a7 · verified · 1 Parent(s): ebdf003

Update README.md

Files changed (1)
  1. README.md +29 -16
README.md CHANGED
@@ -8,29 +8,42 @@ tags:
  - cross-encoder
  ---

- # Set-Encoder

- This repository contains the code for the paper: [`Set-Encoder: Permutation-Invariant Inter-Passage Attention for Listwise Passage Re-Ranking with Cross-Encoders`](https://arxiv.org/abs/2404.06912).

- We use [`lightning-ir`](https://github.com/webis-de/lightning-ir) to train and fine-tune models. Download and install the library to use the code in this repository.

- ## Model Zoo

- We provide the following pre-trained models:

- | Model Name | TREC DL 19 (BM25) | TREC DL 20 (BM25) | TREC DL 19 (ColBERTv2) | TREC DL 20 (ColBERTv2) |
- | ------------------------------------------------------------------- | ----------------- | ----------------- | ---------------------- | ---------------------- |
- | [set-encoder-base](https://huggingface.co/webis/set-encoder-base) | 0.724 | 0.710 | 0.788 | 0.777 |
- | [set-encoder-large](https://huggingface.co/webis/set-encoder-large) | 0.727 | 0.735 | 0.789 | 0.790 |

- ## Inference

- We recommend using the `lightning-ir` cli to run inference. The following command can be used to run inference using the `set-encoder-base` model on the TREC DL 19 and TREC DL 20 datasets:

- ```bash
- lightning-ir re_rank --config configs/re-rank.yaml --config configs/set-encoder-finetuned.yaml --config configs/trec-dl.yaml
- ```

- ## Fine-Tuning

- WIP
+ # Set-Encoder: Permutation-Invariant Inter-Passage Attention for Listwise Passage Re-Ranking with Cross-Encoders

+ This model is presented in the paper [Set-Encoder: Permutation-Invariant Inter-Passage Attention for Listwise Passage Re-Ranking with Cross-Encoders](https://huggingface.co/papers/2404.06912). It's a cross-encoder architecture designed for efficient and permutation-invariant passage re-ranking.

+ Code: https://github.com/webis-de/set-encoder
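
+ To use the models, install the [Lightning IR](https://github.com/webis-de/lightning-ir) library and fetch the configuration files from the repository above. The following is a minimal setup sketch (it assumes Lightning IR is published on PyPI as `lightning-ir`; otherwise install it from its repository):

+ ```bash
+ # Install the Lightning IR library (assumed to be available on PyPI as lightning-ir)
+ pip install lightning-ir
+ # Clone the Set-Encoder repository for the re-ranking configuration files
+ git clone https://github.com/webis-de/set-encoder.git
+ cd set-encoder
+ ```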
+ We provide the following pre-trained models for general-purpose re-ranking.

+ To reproduce the results, run the following command using the [Lightning IR](https://github.com/webis-de/lightning-ir) library and the configuration files from the repository linked above:

+ ```bash
+ lightning-ir re_rank --config ./configs/re-rank.yaml --model.model_name_or_path <MODEL_NAME>
+ ```
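
+ For example, to re-rank with the base model, substitute `<MODEL_NAME>` with one of the model names from the table below:

+ ```bash
+ # Concrete instantiation of the command above with the base Set-Encoder checkpoint
+ lightning-ir re_rank --config ./configs/re-rank.yaml --model.model_name_or_path webis/set-encoder-base
+ ```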
 
+ All scores are nDCG@10 on TREC DL 19 and TREC DL 20:

+ | Model Name | TREC DL 19 (BM25) | TREC DL 20 (BM25) | TREC DL 19 (ColBERTv2) | TREC DL 20 (ColBERTv2) |
+ | ---------------------------------------------------------------------------------------- | ----------------- | ----------------- | ---------------------- | ---------------------- |
+ | [webis/set-encoder-base](https://huggingface.co/webis/set-encoder-base) | 0.746 | 0.704 | 0.781 | 0.768 |
+ | [webis/set-encoder-large](https://huggingface.co/webis/set-encoder-large) | 0.750 | 0.722 | 0.789 | 0.791 |

+ ## Citation

+ If you use this code or the models in your research, please cite our paper:

+ ```bibtex
+ @InProceedings{schlatt:2025,
+   address =   {Berlin Heidelberg New York},
+   author =    {Ferdinand Schlatt and Maik Fr{\"o}be and Harrisen Scells and Shengyao Zhuang and Bevan Koopman and Guido Zuccon and Benno Stein and Martin Potthast and Matthias Hagen},
+   booktitle = {Advances in Information Retrieval. 47th European Conference on IR Research (ECIR 2025)},
+   doi =       {10.1007/978-3-031-88711-6_1},
+   month =     apr,
+   publisher = {Springer},
+   series =    {Lecture Notes in Computer Science},
+   site =      {Lucca, Italy},
+   title =     {{Set-Encoder: Permutation-Invariant Inter-Passage Attention for Listwise Passage Re-Ranking with Cross-Encoders}},
+   year =      2025
+ }