Oct 5, 2022 · This paper studies the problem of how to automatically, adaptively, and dynamically learn instance-level self-supervised learning strategies for each node.
Homophily-Enhanced Self-supervision for Graph Structure Learning ... Automated graph self-supervised learning via multi-teacher knowledge distillation.
This repository contains a list of papers on self-supervised learning for graph neural networks (GNNs), categorized by publication year.
In this paper, we propose a novel multi-teacher knowledge distillation framework for Automated Graph Self-Supervised Learning (AGSSL), which consists of two ...
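To make the multi-teacher distillation idea in these abstracts concrete, here is a minimal PyTorch sketch: several frozen teacher GNNs (each pre-trained on a different self-supervised pretext task) produce temperature-softened predictions, which are combined into a weighted ensemble target for the student. The function name, uniform weights, and temperature are illustrative assumptions, not the paper's exact AGSSL objective; the abstracts suggest the actual method integrates teacher knowledge at the instance level, so per-node weights would replace the scalar weights used here.

```python
import torch
import torch.nn.functional as F

def multi_teacher_distill_loss(student_logits, teacher_logits_list, weights, T=2.0):
    """KL divergence between the student's softened predictions and a
    weighted ensemble of teacher soft labels (hypothetical helper, not
    the paper's exact formulation)."""
    # Weighted mixture of the teachers' temperature-softened distributions.
    teacher_probs = sum(
        w * F.softmax(t / T, dim=-1)
        for w, t in zip(weights, teacher_logits_list)
    )
    log_student = F.log_softmax(student_logits / T, dim=-1)
    # The T^2 factor keeps gradient scale comparable across temperatures,
    # as in standard knowledge distillation.
    return F.kl_div(log_student, teacher_probs, reduction="batchmean") * (T * T)

# Toy usage: three frozen teachers, uniform weights, 4 nodes / 7 classes.
num_nodes, num_classes = 4, 7
student_logits = torch.randn(num_nodes, num_classes, requires_grad=True)
teacher_logits_list = [torch.randn(num_nodes, num_classes) for _ in range(3)]
loss = multi_teacher_distill_loss(student_logits, teacher_logits_list, [1 / 3] * 3)
loss.backward()
```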
This paper is concerned with self-supervised learning for small models. Tags: Knowledge Distillation · Self-Supervised Learning.
Extracting low-/high-frequency knowledge from graph neural networks and injecting it into MLPs: an effective GNN-to-MLP distillation framework
Automated Graph Self-supervised Learning via Multi-teacher Knowledge Distillation · no code implementations • 5 Oct 2022 • Lirong Wu, Yufei Huang, Haitao Lin, ...