BingoCGN employs cross-partition message quantization to summarize inter-partition message flow, which eliminates the need for irregular off-chip memory access and utilizes a fine-grained structured ...
SM-GNN prunes multi-view GNNs to pure propagation, cutting training time while outperforming prior MKGC accuracies on two ...
Highly sparse knowledge graphs are a key challenge for the knowledge graph completion task. Due to the sparsity of the KGs, there are not enough first-order neighbors to learn the features of ...
Learn about the most prominent types of modern neural networks such as feedforward, recurrent, convolutional, and transformer networks, and their use cases in modern AI. Neural networks are the ...
Representing a molecule in a way that captures both its structure and function is central to tasks such as molecular property prediction and drug-drug ...
Adapting to the Stream: An Instance-Attention GNN Method for Irregular Multivariate Time Series Data
DynIMTS replaces static graphs with instance-attention that updates edge weights on the fly, delivering SOTA imputation and P12 classification ...
A team led by Guoyin Yin at Wuhan University and the Shanghai Artificial Intelligence Laboratory recently proposed a modular machine learning ...