Entropy Aware Message Passing in Graph Neural Networks
Philipp Nazari, Oliver Lemke, Davide Guidobene, and 1 more author
arXiv preprint arXiv:2403.04636, 2024
Deep Graph Neural Networks struggle with oversmoothing. This paper introduces a novel, physics-inspired GNN model designed to mitigate this issue. Our approach integrates with existing GNN architectures, introducing an entropy-aware message passing term. This term performs gradient ascent on the entropy during node aggregation, thereby preserving a certain degree of entropy in the embeddings. We conduct a comparative analysis of our model against state-of-the-art GNNs across various common datasets.
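To illustrate the mechanism described in the abstract, here is a minimal sketch, not the authors' implementation: a GCN-style layer in PyTorch whose aggregation step is followed by a gradient-ascent step on an entropy of the node embeddings. The entropy definition (a softmax distribution over per-node energies), the class and function names (`EntropyAwareGCNLayer`, `embedding_entropy`), and the step size `step` are illustrative assumptions, not taken from the paper.

```python
import torch
import torch.nn as nn


def embedding_entropy(x: torch.Tensor) -> torch.Tensor:
    """Shannon entropy of a softmax distribution over per-node energies.

    Illustrative assumption; not the paper's exact entropy definition.
    """
    energies = (x ** 2).sum(dim=1)           # one scalar "energy" per node
    p = torch.softmax(-energies, dim=0)      # Boltzmann-like distribution over nodes
    return -(p * torch.log(p + 1e-12)).sum()


class EntropyAwareGCNLayer(nn.Module):
    """GCN-style layer with an entropy gradient-ascent term after aggregation."""

    def __init__(self, in_dim: int, out_dim: int, step: float = 0.1):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)
        self.step = step  # step size of the gradient ascent on the entropy

    def forward(self, x: torch.Tensor, adj_norm: torch.Tensor) -> torch.Tensor:
        # Standard message passing: normalized adjacency times transformed features.
        h = adj_norm @ self.lin(x)
        # Entropy-aware term: nudge the aggregated embeddings in the direction
        # that increases their entropy, counteracting oversmoothing.
        # (Assumes gradients are enabled, e.g. during training.)
        grad = torch.autograd.grad(embedding_entropy(h), h, create_graph=True)[0]
        return h + self.step * grad


# Toy usage: 4 nodes, symmetrically normalized adjacency with self-loops.
N, d = 4, 8
A = torch.tensor([[0, 1, 0, 0],
                  [1, 0, 1, 1],
                  [0, 1, 0, 1],
                  [0, 1, 1, 0]], dtype=torch.float)
A_hat = A + torch.eye(N)
deg_inv_sqrt = A_hat.sum(1).pow(-0.5)
adj_norm = deg_inv_sqrt[:, None] * A_hat * deg_inv_sqrt[None, :]
out = EntropyAwareGCNLayer(d, d)(torch.randn(N, d), adj_norm)
```

The correction term can be added to other aggregation schemes in the same way: compute the usual aggregated embeddings, then take one ascent step along the gradient of the chosen entropy with respect to those embeddings.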