We present a novel Graph Neural Network (GNN) architecture that simplifies the Graph Attention Network (GAT) model through implicit computation of edge attention coefficients and a sparse-dense matrix multiplication shared between heads. These improvements reduce training time and memory consumption while preserving the model capacity of GAT. On several established benchmarks, our model performs on par with the state of the art, yet with efficiency and scalability comparable to simpler models such as the Graph Convolutional Network (GCN). Notably, we are able to train the model on the large-scale Reddit social network dataset within reasonable training time and memory constraints, which was previously infeasible for models of similar complexity, including GAT.
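To make the idea concrete, the following is a minimal, illustrative sketch only, not the authors' released code. It assumes a GAT-style layer in which (i) edge attention coefficients are computed implicitly from per-node source and destination scores rather than materialized per head, and (ii) the attention weights are shared across heads so that a single sparse-dense multiplication aggregates the features of all heads at once. The class name, parameter names, and the exact scoring function are assumptions for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SharedAttentionGNNLayer(nn.Module):
    """GAT-like layer with one set of edge weights shared across heads (assumption)."""

    def __init__(self, in_dim: int, out_dim: int, heads: int = 4):
        super().__init__()
        self.heads = heads
        self.out_dim = out_dim
        # One linear map produces all heads' features as a single dense matrix.
        self.proj = nn.Linear(in_dim, heads * out_dim, bias=False)
        # Per-node scores that induce edge attention implicitly: e_ij = s_src[i] + s_dst[j].
        self.src_score = nn.Linear(in_dim, 1, bias=False)
        self.dst_score = nn.Linear(in_dim, 1, bias=False)

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        # x: [N, in_dim]; edge_index: [2, E] with rows (source, target).
        src, dst = edge_index
        h = self.proj(x)  # [N, heads * out_dim]

        # Implicit edge scores from node-level terms; no per-head E-sized attention tensors.
        e = F.leaky_relu(self.src_score(x)[src] + self.dst_score(x)[dst]).squeeze(-1)  # [E]

        # Softmax over the incoming edges of each target node (shared across heads).
        e = e - e.max()  # subtract a global constant for numerical stability
        alpha = torch.exp(e)
        denom = torch.zeros(x.size(0), device=x.device).index_add_(0, dst, alpha)
        alpha = alpha / (denom[dst] + 1e-16)  # [E]

        # One sparse-dense multiplication aggregates all heads in a single pass.
        adj = torch.sparse_coo_tensor(
            edge_index.flip(0), alpha, (x.size(0), x.size(0))
        ).coalesce()
        return torch.sparse.mm(adj, h)  # [N, heads * out_dim]


# Usage on a toy graph: 4 nodes, 4 directed edges.
x = torch.randn(4, 8)
edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 0]])
layer = SharedAttentionGNNLayer(in_dim=8, out_dim=16, heads=4)
print(layer(x, edge_index).shape)  # torch.Size([4, 64])
```

Because the edge weights are computed once rather than per head, only one weighted sparse aggregation is needed regardless of the number of heads, which is where the memory and runtime savings over a standard multi-head GAT layer would come from under these assumptions.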