This paper proposes a Web service classification method based on graph neural network knowledge distillation.
The method, built on graph neural network knowledge distillation, first constructs a service relationship network utilizing information from API ...
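The snippet does not say which API information links services, so the following is a minimal sketch under one plausible assumption: two services are related when they invoke at least one common API. The function names and the co-invocation criterion are illustrative, not the paper's definition.

```python
from collections import defaultdict

def build_service_graph(service_apis):
    """Build an undirected service relationship graph: two services are
    linked if they invoke at least one common API (illustrative criterion)."""
    api_to_services = defaultdict(set)
    for service, apis in service_apis.items():
        for api in apis:
            api_to_services[api].add(service)
    edges = set()
    for services in api_to_services.values():
        ordered = sorted(services)
        for i in range(len(ordered)):
            for j in range(i + 1, len(ordered)):
                edges.add((ordered[i], ordered[j]))
    return edges

# Example: three hypothetical services sharing APIs
graph = build_service_graph({
    "weather_app": ["maps", "geocode"],
    "delivery":    ["maps", "payments"],
    "shop":        ["payments"],
})
# weather_app-delivery share "maps"; delivery-shop share "payments"
```

Other linkage signals (shared tags, mashup co-membership, description similarity) would slot into the same edge-building loop.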
Because GNNs seamlessly integrate both the graph structure and the node/edge features into the model, they are a natural fit for graph-based problems.
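How a GNN combines structure with features can be sketched in one message-passing layer: each node averages its neighbourhood's features (structure) and passes the result through a learned linear map (features). This is a generic mean-aggregation layer, not the specific architecture used in the paper.

```python
import numpy as np

def gnn_layer(adj, feats, weight):
    """One message-passing layer: average each node's neighbourhood
    (including itself), then apply a linear map and ReLU."""
    adj_self = adj + np.eye(adj.shape[0])      # add self-loops
    deg = adj_self.sum(axis=1, keepdims=True)  # node degrees
    agg = (adj_self @ feats) / deg             # mean aggregation over neighbours
    return np.maximum(agg @ weight, 0.0)       # linear map + ReLU

# Tiny 3-node path graph: 0-1 and 1-2 connected
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
feats = np.array([[1., 0.], [0., 1.], [1., 1.]])
w = np.eye(2)                                  # identity weights for illustration
out = gnn_layer(adj, feats, w)
# Node 0 mixes its own features with node 1's: [0.5, 0.5]
```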
May 9, 2023 · We propose a representation and classification method based on a pre-trained language model and graph neural network, named PLM-GNN.
A general logits-based method for learning MLPs on graphs
We named this logits-based method Decoupled Graph Knowledge Distillation (DGKD). It can flexibly adjust the weights of TCGD and NCGD for different data samples.
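The snippet gives the idea (separately weighted target-class and non-target-class distillation terms) but not the exact TCGD/NCGD formulas. Assuming those acronyms denote the target-class and non-target-class components, the decoupling can be sketched as follows; `alpha`, `beta`, and the temperature `T` are hypothetical parameters.

```python
import numpy as np

def softmax(z, T=1.0):
    z = z / T
    e = np.exp(z - z.max())
    return e / e.sum()

def decoupled_kd_loss(student_logits, teacher_logits, target,
                      alpha=1.0, beta=1.0, T=2.0):
    """Split the logits-based KD loss into a target-class term and a
    non-target-class term with independent weights (a sketch of the
    decoupling idea; DGKD's exact formulation may differ)."""
    ps, pt = softmax(student_logits, T), softmax(teacher_logits, T)
    # Target-class term: binary KL between (p_target, 1 - p_target)
    bs = np.array([ps[target], 1 - ps[target]])
    bt = np.array([pt[target], 1 - pt[target]])
    tckd = float((bt * np.log(bt / bs)).sum())
    # Non-target-class term: KL over the renormalised non-target classes
    mask = np.ones_like(ps, dtype=bool)
    mask[target] = False
    qs = ps[mask] / ps[mask].sum()
    qt = pt[mask] / pt[mask].sum()
    nckd = float((qt * np.log(qt / qs)).sum())
    return alpha * tckd + beta * nckd
```

Adjusting `alpha` and `beta` per sample is what lets such a loss reweight the two components for different data, as the snippet describes.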
Feb 22, 2024 · This paper proposes a Web service recommendation method via combining knowledge distillation representation and DCNMIX quality prediction.
We propose a lightweight optimization method for GNNs that combines graph contrastive learning and variable-temperature knowledge distillation.
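"Variable-temperature" distillation softens the teacher's outputs by a temperature that changes during training. The snippet does not specify the schedule, so the linear anneal below is one plausible choice; `t_start` and `t_end` are hypothetical parameters.

```python
import numpy as np

def kd_soft_targets(teacher_logits, T):
    """Teacher's temperature-softened class distribution."""
    z = teacher_logits / T
    e = np.exp(z - z.max())
    return e / e.sum()

def temperature(epoch, total_epochs, t_start=4.0, t_end=1.0):
    """Linearly anneal the distillation temperature from t_start to t_end
    (one plausible 'variable temperature' schedule; the paper's own
    schedule is not given in the snippet)."""
    frac = epoch / max(total_epochs - 1, 1)
    return t_start + (t_end - t_start) * frac
```

Early in training a high temperature exposes the teacher's full inter-class similarity structure; as T drops toward 1, the targets sharpen toward the teacher's actual predictions.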
Second, a graph neural network knowledge distillation framework for Web service classification is constructed, consisting of a teacher model and a ...
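The teacher-student framework is typically trained with the classic Hinton-style objective: cross-entropy on the hard labels plus a KL term on temperature-softened teacher targets. This is a generic sketch of that objective, not the paper's exact loss; `T` and the mixing weight `lam` are hypothetical.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, label, T=2.0, lam=0.5):
    """Classic teacher-student objective: weighted sum of hard-label
    cross-entropy and KL between temperature-softened teacher and
    student distributions (generic KD sketch)."""
    ps = softmax(student_logits)
    ce = -np.log(ps[label])                    # hard-label cross-entropy
    ps_T = softmax(student_logits / T)
    pt_T = softmax(teacher_logits / T)
    kl = (pt_T * np.log(pt_T / ps_T)).sum()    # soft-target KL divergence
    return (1 - lam) * ce + lam * (T ** 2) * kl  # T^2 rescales soft gradients
```

The `T ** 2` factor keeps the soft-target gradient magnitude comparable to the hard-label term when the temperature changes.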
This study proposes a graph representation learning model that seamlessly incorporates graph neural network (GNN) and knowledge distillation (KD) techniques, named ...