Distilling dense representations for ranking using tightly-coupled teachers

SC Lin, JH Yang, J Lin - arXiv preprint arXiv:2010.11386, 2020 - arxiv.org
… an approach to ranking with dense representations that applies knowledge distillation to
improve the recently proposed late-interaction ColBERT model. Specifically, we distill the …
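To make the late-interaction scoring that the ColBERT teacher provides concrete, here is a minimal PyTorch sketch of MaxSim scoring. The shapes and the assumption of L2-normalized token embeddings follow ColBERT's usual setup; function and variable names are illustrative, not from the paper.

```python
import torch
import torch.nn.functional as F

def colbert_maxsim_score(q_embs: torch.Tensor, d_embs: torch.Tensor) -> torch.Tensor:
    """Late-interaction (MaxSim) relevance score.

    q_embs: [num_query_tokens, dim] per-token query embeddings
    d_embs: [num_doc_tokens, dim]   per-token document embeddings
    Both are assumed L2-normalized, as in ColBERT.
    """
    # Cosine similarity of every query token with every document token.
    sim = q_embs @ d_embs.T                  # [num_q, num_d]
    # For each query token, keep its best-matching document token ...
    max_per_q_token = sim.max(dim=1).values  # [num_q]
    # ... and sum over query tokens to get the passage score.
    return max_per_q_token.sum()

# Toy usage with random (normalized) token embeddings.
q = F.normalize(torch.randn(8, 128), dim=-1)
d = F.normalize(torch.randn(120, 128), dim=-1)
print(colbert_maxsim_score(q, d))
```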

In-batch negatives for knowledge distillation with tightly-coupled teachers for dense retrieval

SC Lin, JH Yang, J Lin - … of the 6th Workshop on Representation …, 2021 - aclanthology.org
… with dense representations that applies knowledge distillation using the ColBERT late-interaction
ranking … from a bi-encoder teacher to a student by distilling knowledge from ColBERT’s …
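A minimal sketch of the distillation objective this line of work describes: the student bi-encoder's in-batch scores are pulled toward the teacher's soft score distribution via a KL-divergence loss. The shapes, the temperature, and the assumption that the (tightly coupled) teacher scores every in-batch query/passage pair are illustrative, not the paper's exact recipe.

```python
import torch.nn.functional as F

def inbatch_kd_loss(student_q, student_p, teacher_scores, temperature=1.0):
    """KL distillation over in-batch passages.

    student_q: [B, dim] pooled query embeddings from the student bi-encoder
    student_p: [B, dim] pooled passage embeddings (row i is query i's positive;
               the other rows act as in-batch negatives)
    teacher_scores: [B, B] teacher scores (e.g. ColBERT MaxSim) for every
               query/passage pair in the batch
    """
    student_scores = student_q @ student_p.T                # [B, B]
    t = F.softmax(teacher_scores / temperature, dim=1)      # soft labels
    s = F.log_softmax(student_scores / temperature, dim=1)
    # KL(teacher || student), averaged over queries in the batch.
    return F.kl_div(s, t, reduction="batchmean")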

Efficiently teaching an effective dense retriever with balanced topic aware sampling

S Hofstätter, SC Lin, JH Yang, J Lin… - … and Development in …, 2021 - dl.acm.org
… In the BERT_CAT ranking model, the query q_{1:m} and passage … With tightly coupled teachers
(TCT), Lin et al. [24] showed … We observe that the methods not using knowledge distillation …
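The balanced topic-aware sampling idea can be sketched as: cluster queries by topic, then compose each training batch from a single cluster so that in-batch negatives are topically related and thus more informative. The k-means step and the generator below are an assumption-laden illustration, not the authors' exact implementation.

```python
import random
import numpy as np
from sklearn.cluster import KMeans

def topic_aware_batches(query_embs: np.ndarray, batch_size: int, n_clusters: int):
    """Yield batches of query indices drawn from a single topic cluster.

    query_embs: [num_queries, dim] query embeddings (e.g. from a baseline encoder).
    Clustering groups topically similar queries so each batch's in-batch
    negatives are harder, per the topic-aware sampling (TAS) idea.
    """
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(query_embs)
    clusters = [np.flatnonzero(labels == c).tolist() for c in range(n_clusters)]
    eligible = [c for c in clusters if len(c) >= batch_size]
    while True:
        yield random.sample(random.choice(eligible), batch_size)
```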

Improving efficient neural ranking models with cross-architecture knowledge distillation

S Hofstätter, S Althammer, M Schröder… - arXiv preprint arXiv …, 2020 - arxiv.org
… RQ3: How effective is our distillation for dense nearest … (using the ; operator) and the CLS
token representation computed … the teacher models have a very high pairwise ranking accuracy …
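Cross-architecture distillation of this kind matches score margins rather than raw scores: a BERT_CAT teacher and a dense student score on different scales, so the student learns the teacher's margin between a positive and a negative passage. A minimal Margin-MSE sketch:

```python
import torch.nn.functional as F

def margin_mse_loss(s_pos, s_neg, t_pos, t_neg):
    """Cross-architecture distillation via margin MSE.

    s_pos, s_neg: student scores for positive / negative passages, shape [B]
    t_pos, t_neg: teacher scores for the same pairs, shape [B]
    L = MSE( s(q,p+) - s(q,p-),  t(q,p+) - t(q,p-) )
    """
    return F.mse_loss(s_pos - s_neg, (t_pos - t_neg).detach())
```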

Curriculum learning for dense retrieval distillation

H Zeng, H Zamani, V Vinay - … on Research and Development in …, 2022 - dl.acm.org
… high-dimensional dense learned representations for queries and … As the re-ranking teacher
model, we use the MiniLM cross… Tightly-Coupled Teachers. Proceedings of the 6th Workshop …
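One common way to realize a distillation curriculum is sketched below, assuming a linear pacing function over teacher-derived difficulty scores; the paper's actual schedule and difficulty measure may differ.

```python
def curriculum_pool(examples, difficulty, step, total_steps, start_frac=0.2):
    """Return the slice of training examples visible at `step`.

    examples:   list of training instances
    difficulty: parallel list of difficulty scores (e.g. a small teacher
                margin = hard); lower = easier
    The pool starts with the easiest `start_frac` of examples and grows
    linearly until every example is available -- a simple linear pacing
    function used here purely for illustration.
    """
    order = sorted(range(len(examples)), key=lambda i: difficulty[i])
    frac = min(1.0, start_frac + (1.0 - start_frac) * step / total_steps)
    visible = order[: max(1, int(frac * len(examples)))]
    return [examples[i] for i in visible]
```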

Training on the Test Model: Contamination in Ranking Distillation

VS Kalal, A Parry, S MacAvaney - arXiv preprint arXiv:2411.02284, 2024 - arxiv.org
… ranking models applying contextualized representations are … use of teacher output and
RankNet-style order distillation … Measures: We primarily assess densely annotated test collections …
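RankNet-style order distillation, referenced in the snippet, trains the student on the teacher's pairwise orderings rather than its absolute scores. A minimal PyTorch sketch for one query's candidate list:

```python
import torch.nn.functional as F

def ranknet_distill_loss(student_scores, teacher_scores):
    """RankNet-style order distillation for one query's candidates.

    student_scores, teacher_scores: [num_docs] scores over the same candidates.
    For every pair the teacher orders (i above j), apply the RankNet logistic
    loss to the student's score difference, so the student copies the
    teacher's *ordering*, not its score scale.
    """
    diff = student_scores.unsqueeze(1) - student_scores.unsqueeze(0)  # [N, N]
    target = (teacher_scores.unsqueeze(1) > teacher_scores.unsqueeze(0)).float()
    mask = target.bool()
    # -log sigmoid(s_i - s_j) for every teacher-preferred pair (i, j).
    return F.binary_cross_entropy_with_logits(diff[mask], target[mask])
```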

LEAD: liberal feature-based distillation for dense retrieval

H Sun, X Liu, Y Gong, A Dong, J Lu, Y Zhang… - Proceedings of the 17th …, 2024 - dl.acm.org
… widely-used benchmarks, including MS MARCO Passage Ranking, … distillation with
tightly-coupled teachers for dense retrieval. In Proceedings of the 6th Workshop on Representation

PROD: Progressive distillation for dense retrieval

Z Lin, Y Gong, X Liu, H Zhang, C Lin, A Dong… - Proceedings of the …, 2023 - dl.acm.org
… (CE) is often used as the re-ranking model, rearranging the retriever … distillation with
tightly-coupled teachers for dense retrieval. In Proceedings of the 6th Workshop on Representation
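The retrieve-then-re-rank pattern the snippet describes: a cross-encoder (CE) jointly scores each query/passage pair from the retriever's candidate list and re-sorts it. A minimal sketch using a public MS MARCO MiniLM cross-encoder checkpoint (any CE would do; the checkpoint choice is illustrative):

```python
from sentence_transformers import CrossEncoder

ce = CrossEncoder("cross-encoder/ms-marco-MiniLM-L-6-v2")

def rerank(query: str, candidates: list[str]) -> list[str]:
    """Re-rank a retriever's candidate passages with a cross-encoder.

    The CE scores each (query, passage) pair jointly, then the candidates
    are re-sorted by that score -- the standard rearranging of the
    retriever's list that the snippet above refers to.
    """
    scores = ce.predict([(query, passage) for passage in candidates])
    return [p for _, p in sorted(zip(scores, candidates), reverse=True)]
```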

DiSCo Meets LLMs: A Unified Approach for Sparse Retrieval and Contextual Distillation in Conversational Search

S Lupart, M Aliannejadi, E Kanoulas - arXiv preprint arXiv:2410.14609, 2024 - arxiv.org
representations compared to those of the original teacher … of the teacher with the student
model, combining the ranked … Distillation with Tightly-Coupled Teachers for Dense Retrieval …

Balanced Knowledge Distillation with Contrastive Learning for Document Re-ranking

Y Yang, S He, Y Qiao, W Xie, T Yang - Proceedings of the 2023 ACM …, 2023 - dl.acm.org
… architecture to produce dense document representations [12, 29… Recent studies in dense
retrieval [45, 52] have also adopted … to distill knowledge from a teacher model for re-ranking. …
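A common way to combine the two objectives this title names, with a fixed weighting coefficient standing in for whatever balancing scheme the paper actually uses:

```python
import torch
import torch.nn.functional as F

def kd_plus_contrastive_loss(student_scores, teacher_scores, alpha=0.5, tau=1.0):
    """Combine distillation and contrastive objectives for re-ranking.

    student_scores, teacher_scores: [B, 1+K] scores per query over one
    positive (column 0) and K negatives.
    L = alpha * KL(teacher || student) + (1 - alpha) * InfoNCE
    The fixed `alpha` is an illustrative assumption.
    """
    kd = F.kl_div(F.log_softmax(student_scores / tau, dim=1),
                  F.softmax(teacher_scores / tau, dim=1),
                  reduction="batchmean")
    labels = torch.zeros(student_scores.size(0), dtype=torch.long)
    contrastive = F.cross_entropy(student_scores, labels)  # positive at index 0
    return alpha * kd + (1 - alpha) * contrastive
```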