Part of the reproducing-cross-encoders collection: a set of cross-encoders trained from various backbones and losses for fair comparison.
This model is a cross-encoder based on jhu-clsp/ettin-encoder-150m. It was trained on MS MARCO with a hinge loss, as part of a reproducibility paper on training cross-encoders: "Reproducing and Comparing Distillation Techniques for Cross-Encoders". See the paper for more details.
This model is intended for re-ranking the top results returned by a first-stage retrieval system (such as BM25, a bi-encoder, or SPLADE).
Training can easily be reproduced using the associated repository. The exact training configuration used for this model is also detailed in config.yaml.
Quick Start:

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

# The tokenizer comes from the ettin backbone; the fine-tuned weights from this repository
tokenizer = AutoTokenizer.from_pretrained("jhu-clsp/ettin-encoder-150m")
model = AutoModelForSequenceClassification.from_pretrained("xpmir/cross-encoder-ettin-150m-Hinge")
model.eval()

# Cross-encoders encode the query and the document jointly
features = tokenizer(
    "What is experimaestro?",
    "Experimaestro is a powerful framework for ML experiments management...",
    padding=True, truncation=True, return_tensors="pt",
)

with torch.no_grad():
    scores = model(**features).logits
print(scores)
```
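Since the model is meant as a re-ranker, here is a minimal sketch (reusing the tokenizer and model loaded above) that scores a small batch of candidate passages for one query and sorts them by score. The passages are illustrative placeholders, not actual retrieval output:

```python
# Minimal re-ranking sketch: score every (query, passage) pair, then sort by score.
# The passages below are illustrative placeholders, not real retrieval output.
query = "What is experimaestro?"
passages = [
    "Experimaestro is a powerful framework for ML experiments management...",
    "BM25 is a bag-of-words lexical ranking function...",
    "SPLADE is a sparse neural retrieval model...",
]

features = tokenizer(
    [query] * len(passages), passages,
    padding=True, truncation=True, return_tensors="pt",
)

with torch.no_grad():
    # Assumes a single-logit head (one relevance score per pair)
    scores = model(**features).logits.squeeze(-1)

# Higher score means more relevant
for rank, idx in enumerate(scores.argsort(descending=True).tolist(), start=1):
    print(f"{rank}. ({scores[idx].item():.4f}) {passages[idx]}")
```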
We evaluate this cross-encoder by re-ranking the top 1000 documents retrieved by naver/splade-v3-distilbert (scores are reported ×100):
| dataset | RR@10 | nDCG@10 |
|---|---|---|
| msmarco_dev | 38.26 | 44.88 |
| trec2019 | 96.98 | 75.09 |
| trec2020 | 91.98 | 69.89 |
| fever | 81.90 | 81.41 |
| arguana | 16.83 | 25.15 |
| climate_fever | 27.62 | 20.27 |
| dbpedia | 72.22 | 42.88 |
| fiqa | 48.39 | 40.23 |
| hotpotqa | 84.55 | 67.46 |
| nfcorpus | 55.66 | 34.96 |
| nq | 52.09 | 56.95 |
| quora | 65.37 | 68.50 |
| scidocs | 28.41 | 16.25 |
| scifact | 68.82 | 71.82 |
| touche | 64.70 | 35.97 |
| trec_covid | 92.33 | 78.15 |
| robust04 | 68.72 | 45.20 |
| lotte_writing | 74.43 | 65.16 |
| lotte_recreation | 63.23 | 58.08 |
| lotte_science | 50.55 | 41.48 |
| lotte_technology | 58.66 | 49.55 |
| lotte_lifestyle | 73.19 | 64.08 |
| Mean In Domain | 75.74 | 63.29 |
| BEIR 13 | 58.38 | 49.23 |
| LoTTE (OOD) | 64.80 | 53.93 |
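For reference, the sketch below is a minimal, self-contained illustration of how the two metrics in the table are defined. It uses the exponential-gain variant of nDCG; evaluation toolkits such as trec_eval or ir_measures may use a linear gain, and this is not the evaluation code used for the table above:

```python
import math

def rr_at_k(ranked_ids, qrels, k=10):
    # Reciprocal rank of the first relevant document within the top k (0 if none)
    for rank, doc_id in enumerate(ranked_ids[:k], start=1):
        if qrels.get(doc_id, 0) > 0:
            return 1.0 / rank
    return 0.0

def ndcg_at_k(ranked_ids, qrels, k=10):
    # DCG over the top k with exponential gain, normalized by the ideal DCG
    dcg = sum(
        (2 ** qrels.get(doc_id, 0) - 1) / math.log2(rank + 1)
        for rank, doc_id in enumerate(ranked_ids[:k], start=1)
    )
    ideal = sorted(qrels.values(), reverse=True)[:k]
    idcg = sum(
        (2 ** rel - 1) / math.log2(rank + 1)
        for rank, rel in enumerate(ideal, start=1)
    )
    return dcg / idcg if idcg > 0 else 0.0

# Toy example: d2 (relevance 2) and d3 (relevance 1) are the relevant documents
qrels = {"d2": 2, "d3": 1}
ranked = ["d1", "d2", "d3", "d4"]
print(rr_at_k(ranked, qrels))    # 0.5  (first relevant document at rank 2)
print(ndcg_at_k(ranked, qrels))  # ~0.659
```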
Base model: jhu-clsp/ettin-encoder-150m