Towards trustworthy dataset distillation

S Ma, F Zhu, Z Cheng, XY Zhang - Pattern Recognition, 2025 - Elsevier
… Motivated by this, in this paper, we consider both efficiency and trustworthiness … We distill
both InD data and the generated outliers into a tiny dataset; such a distilled dataset is capable of …
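
A minimal sketch of the idea this entry describes, training on a distilled set that mixes in-distribution (InD) images with generated outliers, is given below. It is not the authors' method: the tiny model, the random placeholder tensors, and the uniform-target outlier loss are my own assumptions.

```python
# Minimal sketch (not the authors' method): train on a distilled set that holds
# both InD images with class labels and distilled outliers supervised toward a
# uniform prediction, so the resulting model can also flag OOD inputs.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallConvNet(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

num_classes = 10
# Placeholder distilled set: a few InD images per class plus a few outliers.
ind_images = torch.randn(100, 3, 32, 32)
ind_labels = torch.arange(100) % num_classes
outlier_images = torch.randn(20, 3, 32, 32)

model = SmallConvNet(num_classes)
opt = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

for step in range(100):
    opt.zero_grad()
    # Standard cross-entropy on the distilled InD portion.
    ce = F.cross_entropy(model(ind_images), ind_labels)
    # Push outlier predictions toward the uniform distribution (outlier exposure),
    # so low max-probability can be used as an OOD score at test time.
    logits_out = model(outlier_images)
    uniform = torch.full_like(logits_out, 1.0 / num_classes)
    oe = F.kl_div(F.log_softmax(logits_out, dim=1), uniform, reduction="batchmean")
    (ce + 0.5 * oe).backward()
    opt.step()
```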

Towards Trustworthy Dataset Distillation: A Benchmark of Privacy, Fairness and Robustness

Z Chen, J Geng, D Zhu, Q Li… - … Joint Conference on …, 2024 - ieeexplore.ieee.org
… propose a trustworthy benchmark for assessing representative dataset distillation solutions
… vulnerabilities inherent in the application of dataset distillation, offering valuable insights for …
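
As a rough illustration of what such a benchmark measures, the sketch below scores a model trained on distilled data for overall accuracy and a simple fairness proxy (accuracy gap between two sensitive groups). The evaluation protocol, `model`, `test_loader`, and the binary group attribute are hypothetical, not taken from the paper.

```python
# Minimal sketch (my own illustration, not the benchmark's protocol): evaluate a
# model trained on distilled data for accuracy and a per-group accuracy gap.
import torch

@torch.no_grad()
def evaluate(model, test_loader):
    correct = {0: 0, 1: 0}
    total = {0: 0, 1: 0}
    for images, labels, groups in test_loader:  # groups: 0/1 sensitive attribute
        preds = model(images).argmax(dim=1)
        for g in (0, 1):
            mask = groups == g
            correct[g] += (preds[mask] == labels[mask]).sum().item()
            total[g] += mask.sum().item()
    acc = {g: correct[g] / max(total[g], 1) for g in (0, 1)}
    overall = sum(correct.values()) / max(sum(total.values()), 1)
    fairness_gap = abs(acc[0] - acc[1])  # smaller gap = more equal treatment
    return overall, fairness_gap
```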

Trustal: Trustworthy active learning using knowledge distillation

B Kwak, Y Kim, YJ Kim, S Hwang, J Yeo - Proceedings of the AAAI …, 2022 - ojs.aaai.org
… as a good surrogate for the labeled dataset. A naive way to … Trustworthy AL) that enables the
training of a consistent acquisition model that serves as a good reference for the labeled dataset
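
The distillation component here is the standard soft-label knowledge-distillation loss; the sketch below shows that loss in isolation. The teacher/student roles of successive acquisition models and the weighting `alpha` are my own simplification, not TrustAL's exact formulation.

```python
# Minimal sketch of the Hinton-style knowledge-distillation loss that this line
# of work builds on: soft targets from a teacher plus hard-label cross-entropy.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # KL between temperature-softened teacher and student distributions.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-label cross-entropy on the labeled pool.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```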

Reliable adversarial distillation with unreliable teachers

J Zhu, J Yao, B Han, J Zhang, T Liu, G Niu… - arXiv preprint arXiv …, 2021 - arxiv.org
… guidance of the teacher model in distillation is reliable or not. … distillation is not always
trustworthy. Formally, adversarial … data during adversarial distillation on the CIFAR-10 dataset. Noted …
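
A simplified sketch of adversarial distillation with a crude teacher-reliability check is shown below; the FGSM attack, the correctness-based mask, and the per-sample fallback to hard labels are my own stand-ins for the paper's approach, not its actual algorithm.

```python
# Minimal sketch (a simplification, not the authors' exact method): distill from
# the teacher on adversarial examples only where the teacher is still correct.
import torch
import torch.nn.functional as F

def fgsm(model, x, y, eps=8 / 255):
    x_adv = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x_adv), y)
    grad = torch.autograd.grad(loss, x_adv)[0]
    return (x_adv + eps * grad.sign()).clamp(0, 1).detach()

def adversarial_distillation_step(student, teacher, x, y, T=2.0):
    x_adv = fgsm(student, x, y)                 # perturb against the student
    with torch.no_grad():
        t_logits = teacher(x_adv)
        reliable = t_logits.argmax(dim=1) == y  # crude reliability mask
    s_logits = student(x_adv)
    kd = F.kl_div(F.log_softmax(s_logits / T, dim=1),
                  F.softmax(t_logits / T, dim=1),
                  reduction="none").sum(dim=1) * (T * T)
    ce = F.cross_entropy(s_logits, y, reduction="none")
    # Trust the teacher only where it remains correct on the adversarial input.
    return torch.where(reliable, kd, ce).mean()
```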

Task-driven causal feature distillation: Towards trustworthy risk prediction

Z Chu, M Hu, Q Cui, L Li, S Li - Proceedings of the AAAI Conference on …, 2024 - ojs.aaai.org
Trustworthy risk predictions with causal interpretability and high precision and recall can be
… on distilled data. We evaluate our TDCFD method on several synthetic and real datasets, …

Ensuring Trustworthiness in Decentralized Systems through Federated Distillation and Feature Mixing

C Chatzikonstantinou, A Psaltis… - … Conference on Big …, 2024 - ieeexplore.ieee.org
Towards this direction, a novel technique known as Fed… IID dataset. A prevalent approach
in the literature involves the use of a public dataset accessible to all clients for local distillation, …
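
The public-dataset pattern mentioned in this snippet can be sketched as follows: clients exchange only their logits on a shared public set, the server averages them, and each client distills toward the consensus. This omits the paper's feature-mixing contribution, and all function names are hypothetical.

```python
# Minimal sketch of public-dataset federated distillation (not this paper's
# feature-mixing variant): share logits, aggregate, then distill locally.
import torch
import torch.nn.functional as F

def client_logits(model, public_x):
    with torch.no_grad():
        return model(public_x)

def aggregate(all_logits):
    # Server-side: average per-client predictions on the shared public data.
    return torch.stack(all_logits).mean(dim=0)

def local_distill(model, optimizer, public_x, consensus_logits, T=2.0, steps=10):
    for _ in range(steps):
        optimizer.zero_grad()
        loss = F.kl_div(F.log_softmax(model(public_x) / T, dim=1),
                        F.softmax(consensus_logits / T, dim=1),
                        reduction="batchmean") * (T * T)
        loss.backward()
        optimizer.step()
```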

Federated distillation for medical image classification: Towards trustworthy computer-aided diagnosis

S Ren, Y Hu, S Chen, G Wang - arXiv preprint arXiv:2407.02261, 2024 - arxiv.org
… In conclusion, the superiority of FEDMIC across different medical image classification
datasets comes from the dual knowledge distillation mechanism and global parameter …

Towards adversarially robust dataset distillation by curvature regularization

E Xue, Y Li, H Liu, Y Shen, H Wang - arXiv preprint arXiv:2403.10045, 2024 - arxiv.org
… against malicious inputs, making it a crucial aspect of trustworthy machine learning. Given
the potential of dataset distillation to safeguard the privacy of the original dataset (Geng et al., …
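
A sketch of a curvature-regularization term in this spirit (close to CURE's finite-difference estimate) appears below; how it is folded into the dataset-distillation objective is not shown, and the step size `h` is an assumed value.

```python
# Minimal sketch of a curvature penalty: a finite-difference estimate of how fast
# the loss gradient changes along the gradient direction (CURE-style). Keeping
# create_graph=True makes the penalty differentiable for the outer objective.
import torch
import torch.nn.functional as F

def curvature_penalty(model, x, y, h=1.5):
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    grad = torch.autograd.grad(loss, x, create_graph=True)[0]
    z = grad.detach().sign()                      # probe direction
    x_h = (x.detach() + h * z).requires_grad_(True)
    loss_h = F.cross_entropy(model(x_h), y)
    grad_h = torch.autograd.grad(loss_h, x_h, create_graph=True)[0]
    # Penalize large gradient change => flatter, more robust loss surface.
    return (grad_h - grad).flatten(1).norm(dim=1).pow(2).mean()
```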

A comprehensive survey of dataset distillation

S Lei, D Tao - IEEE Transactions on Pattern Analysis and …, 2023 - ieeexplore.ieee.org
… in compressing datasets, there are still several limitations, such as distilling … dataset distillation
from multiple aspects, including distillation frameworks and algorithms, factorized dataset …

Rethinking data distillation: Do not overlook calibration

D Zhu, B Lei, J Zhang, Y Fang, Y Xie… - Proceedings of the …, 2023 - openaccess.thecvf.com
… calibrate networks trained on data distilled from large source datasets. In this paper, we
show that …
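
For context, the sketch below shows post-hoc temperature scaling, a standard baseline for calibrating a network trained on distilled data; the paper argues for its own calibration remedies, so this is only the common starting point, with `val_logits` and `val_labels` as placeholders.

```python
# Minimal sketch of post-hoc temperature scaling: fit a single temperature T on
# held-out validation logits, then divide test logits by T before softmax.
import torch
import torch.nn.functional as F

def fit_temperature(val_logits, val_labels, steps=200, lr=0.01):
    log_t = torch.zeros(1, requires_grad=True)   # optimize T in log-space (T > 0)
    opt = torch.optim.Adam([log_t], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = F.cross_entropy(val_logits / log_t.exp(), val_labels)
        loss.backward()
        opt.step()
    return log_t.exp().item()

# Usage: T = fit_temperature(val_logits, val_labels)
#        calibrated_probs = F.softmax(test_logits / T, dim=1)
```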