Mar 11, 2024 · We propose a simple but effective method AuG-KD. It utilizes an uncertainty-guided and sample-specific anchor to align student-domain data with the teacher ...
Nov 21, 2023 · A method named AuG-KD is proposed to employ a data-driven anchor to align student-domain data with the teacher domain, using a generative ...
Jun 10, 2024 · AuG-KD: Anchor-Based Mixup Generation for Out-of-Domain Knowledge Distillation. June 2024. Conference: The Twelfth International Conference on ...
Our method utilizes an uncertainty-driven and sample-specific anchor to align student-domain data with Dt and leverages a generative method to progressively ...
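The snippets above describe mixing each student-domain sample with a sample-specific anchor that has been mapped toward the teacher domain. A minimal sketch of that mixing step is below; the function names, the linear schedule, and the assumption that the anchor is already a teacher-domain-aligned point are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def mixup_with_anchor(x_student, x_anchor, lam):
    """Convex combination of a student-domain sample and its anchor.

    x_anchor is assumed (hypothetically) to be a sample-specific point
    already mapped toward the teacher domain; lam in [0, 1] controls
    how far the mixed sample moves toward it.
    """
    return lam * x_anchor + (1.0 - lam) * x_student

def mixing_schedule(step, total_steps):
    """Hypothetical linear schedule: early in training the mixed samples
    stay close to the student domain (lam near 0), and they progressively
    shift toward the teacher domain (lam near 1)."""
    return min(1.0, step / max(1, total_steps - 1))

# Usage sketch: mix a sample halfway toward its anchor.
x = np.array([0.0, 2.0])
anchor = np.array([4.0, 6.0])
mixed = mixup_with_anchor(x, anchor, mixing_schedule(5, 11))
```

In this reading, the schedule is what makes the generation "progressive": the distillation data drifts from the student domain toward the teacher domain over training.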
It introduces AuG-KD, a method that aligns student-domain data with the teacher domain using an anchor-based mixup generation approach. Extensive experiments ...
AuG-KD: Anchor-Based Mixup Generation for Out-of-Domain Knowledge Distillation ... This work proposes a simple but effective method AuG-KD that utilizes an ...
AuG-KD: Anchor-Based Mixup Generation for Out-of-Domain Knowledge Distillation. T Zihao, Z Lv, S Zhang, Y Zhou, X Duan, F Wu, K Kuang. The Twelfth ...
AuG-KD: Anchor-Based Mixup Generation for Out-of-Domain Knowledge Distillation ... domains a model that can generalize well to unknown target domains.
AuG-KD: Anchor-Based Mixup Generation for Out-of-Domain Knowledge Distillation ... In this paper, we propose a simple yet effective method, AuG-KD. It utilizes an uncertainty-based and sample- ...
In this work, we propose a simple but effective method AuG-KD. It utilizes an uncertainty-guided and sample-specific anchor to align student-domain data with ...
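The mixed samples are then used for knowledge distillation. The snippets do not give the paper's loss, but the standard Hinton-style distillation objective they build on is a temperature-softened KL divergence between teacher and student outputs; a self-contained numpy sketch:

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-softened softmax along the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=4.0):
    """Standard distillation loss: KL(teacher || student) on
    temperature-softened distributions, scaled by T^2 so gradient
    magnitudes stay comparable across temperatures."""
    p_t = softmax(teacher_logits, T)
    log_p_s = np.log(softmax(student_logits, T))
    log_p_t = np.log(p_t)
    kl = (p_t * (log_p_t - log_p_s)).sum(axis=-1)
    return float(kl.mean() * T * T)

# Usage sketch: identical logits give zero loss; mismatched logits do not.
a = np.array([[2.0, 0.5, -1.0]])
b = np.array([[0.1, 1.5, 0.3]])
```

In an out-of-domain setting, this loss would be computed on the anchor-mixed samples rather than raw student-domain data, so the teacher is queried closer to the distribution it was trained on.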