GPL: Generative Pseudo Labeling for Unsupervised Domain Adaptation of Dense Retrieval
Paper: arXiv:2112.07577
This is the zero-shot baseline model from the paper "GPL: Generative Pseudo Labeling for Unsupervised Domain Adaptation of Dense Retrieval".
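A minimal usage sketch with the sentence-transformers library is shown below. The model ID is a placeholder (replace it with this repository's ID or a local path), and dot-product scoring is assumed, as is common for MS MARCO-trained dense retrievers.

```python
from sentence_transformers import SentenceTransformer, util

# Placeholder: replace with this repository's model ID or a local checkpoint path.
model = SentenceTransformer("path/to/this-model")

query = "how do dense retrievers work"
passages = [
    "Dense retrievers encode queries and documents into vectors and rank by similarity.",
    "The Great Wall of China is visible in satellite imagery.",
]

# Encode the query and candidate passages into dense vectors.
query_emb = model.encode(query, convert_to_tensor=True)
passage_embs = model.encode(passages, convert_to_tensor=True)

# Rank passages by dot-product similarity to the query.
scores = util.dot_score(query_emb, passage_embs)
print(scores)
```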
The training setup:
- Start checkpoint: distilbert-base-uncased
- Retrievers for hard-negative mining: sentence-transformers/msmarco-distilbert-base-v3 and sentence-transformers/msmarco-MiniLM-L-6-v3
- Cross-encoder providing the training labels: cross-encoder/ms-marco-MiniLM-L-6-v2
- Trained for 70K steps with batch size 75 and max. sequence length 350
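Since the setup pairs two hard-negative retrievers with a cross-encoder that scores (query, passage) pairs, it matches MarginMSE-style distillation; the sketch below assumes that objective. The toy triples and hyperparameters are placeholders, not the original training script, which uses MS MARCO data and the step count, batch size, and sequence length listed above.

```python
# Minimal MarginMSE distillation sketch, assuming the setup described above.
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, CrossEncoder, InputExample, losses, models

# Student: distilbert-base-uncased with mean pooling (pooling choice is an assumption).
word_embedding = models.Transformer("distilbert-base-uncased", max_seq_length=350)
pooling = models.Pooling(word_embedding.get_word_embedding_dimension())
student = SentenceTransformer(modules=[word_embedding, pooling])

# Cross-encoder that scores (query, passage) pairs to produce the training labels.
teacher = CrossEncoder("cross-encoder/ms-marco-MiniLM-L-6-v2")

# Toy (query, positive, hard negative) triples; in the paper these come from
# MS MARCO, with hard negatives mined by the two retrievers listed above.
triples = [
    ("what is a dense retriever",
     "A dense retriever encodes text into vectors.",
     "The Eiffel Tower is in Paris."),
]

train_examples = []
for query, pos, neg in triples:
    pos_score, neg_score = teacher.predict([(query, pos), (query, neg)])
    # The label is the teacher's score margin between the positive and the negative passage.
    train_examples.append(InputExample(texts=[query, pos, neg], label=float(pos_score - neg_score)))

train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=1)  # the paper uses batch size 75
train_loss = losses.MarginMSELoss(model=student)

# The paper trains for 70K steps; here just one pass over the toy data.
student.fit(train_objectives=[(train_dataloader, train_loss)], epochs=1, warmup_steps=0)
```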