"Self-Knowledge Distillation: A Simple Way for Better Generalization."

Kyungyul Kim et al. (2020)

access: open

type: Informal or Other Publication

metadata version: 2020-06-23