Abstract

Owing to their ability to capture high-order node information while reducing memory consumption, implicit graph neural networks have become a research hotspot in recent years. However, existing implicit graph neural networks are limited by a static topology, which makes it difficult to handle heterophilic graph-structured data. Furthermore, existing methods inspired by optimization objectives are constrained by the explicit structure of graph neural networks, which makes it difficult to set an appropriate number of network layers for solving the optimization problem. To address these issues, we propose an implicit graph neural network with flexible propagation operators. Starting from the optimization objective function, we derive an implicit message-passing formula with flexible propagation operators. Compared with a static operator, the proposed method, which jointly exploits the dynamic semantics and topology of the data, is better suited to heterophilic graphs. Moreover, the proposed model performs a fixed-point iterative process to optimize the objective function, which implicitly adjusts the number of network layers without requiring substantial prior knowledge. Extensive experimental results demonstrate the superiority of the proposed model.
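To make the general idea concrete, the sketch below illustrates implicit fixed-point message passing with a feature-dependent ("flexible") propagation operator. It is not the authors' exact formulation: the operator build_propagation, the mixing weight alpha, the tolerance, and the fixed-point equation Z = (1 - alpha) * P(Z) Z W + alpha * X are illustrative assumptions chosen only to show how the iteration count can play the role of an implicit number of layers.

```python
# Minimal sketch of implicit fixed-point message passing with a
# semantics-aware propagation operator.  All names and hyperparameters
# here are assumptions for illustration, not the paper's method.
import numpy as np

def build_propagation(adj, z, temperature=1.0):
    """Combine graph topology with current node semantics: reweight existing
    edges by feature affinity so the operator adapts at every iteration."""
    sim = z @ z.T                               # pairwise semantic affinity
    weights = adj * np.exp(sim / temperature)   # keep only existing edges
    row_sum = weights.sum(axis=1, keepdims=True) + 1e-8
    return weights / row_sum                    # row-normalised propagation

def implicit_gnn(x, adj, w, alpha=0.5, tol=1e-5, max_iter=300):
    """Iterate Z <- (1 - alpha) * P(Z) Z W + alpha * X until a fixed point;
    the number of iterations acts as an implicit layer count and is not
    fixed in advance."""
    z = x.copy()
    for _ in range(max_iter):
        p = build_propagation(adj, z)
        z_next = (1 - alpha) * (p @ z @ w) + alpha * x
        if np.linalg.norm(z_next - z) < tol:
            break
        z = z_next
    return z

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d = 6, 4
    adj = (rng.random((n, n)) < 0.4).astype(float)
    adj = np.maximum(adj, adj.T)                # undirected graph
    np.fill_diagonal(adj, 1.0)                  # self-loops
    x = rng.standard_normal((n, d))
    w = 0.1 * rng.standard_normal((d, d))       # small weight aids contraction
    print(implicit_gnn(x, adj, w).shape)        # -> (6, 4)
```

Because the propagation operator is rebuilt from the current node representations at every step, the effective graph used for message passing evolves with the semantics of the data rather than staying fixed, which is the intuition behind handling heterophilic graphs with a flexible operator.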

Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 License.
