Abstract

As Transformers gain traction in graph machine learning, the issue of "over-globalization" has emerged: global attention mechanisms place excessive weight on distant vertices, diluting vital local information. This is particularly detrimental in graphs where local neighborhoods carry most of the predictive signal. Existing methods often lack flexibility in local processing or fail to integrate local and global contexts effectively. This paper introduces BalancedGraphFormer, a novel framework designed to localize graph transformer training. It pairs a dedicated local module with a complementary global module: the local module captures fine-grained neighborhood patterns, while the global module incorporates broader context without overshadowing local details. Trained with collaborative and warm-up strategies, the two modules jointly mitigate over-globalization. Experimental results on vertex classification tasks demonstrate BalancedGraphFormer's effectiveness against state-of-the-art baselines in addressing the over-globalization challenge.
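The abstract does not specify the modules' internals, so the following is only a minimal sketch of the local/global split it describes, assuming a mean-aggregation local branch, a full self-attention global branch, and a learned per-vertex gate that prevents the global branch from drowning out local signals. All class and parameter names (LocalModule, GlobalModule, BalancedLayer) are hypothetical and are not taken from the paper.

    import torch
    import torch.nn as nn

    class LocalModule(nn.Module):
        # Hypothetical local branch: one round of mean aggregation over
        # each vertex's immediate neighbors, followed by a linear map.
        def __init__(self, dim):
            super().__init__()
            self.lin = nn.Linear(dim, dim)

        def forward(self, x, adj):
            # adj: dense (n, n) adjacency matrix
            deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
            h = adj @ x / deg  # mean over neighbors
            return torch.relu(self.lin(h))

    class GlobalModule(nn.Module):
        # Hypothetical global branch: full multi-head self-attention
        # over all vertices (the component prone to over-globalization).
        def __init__(self, dim, heads=4):
            super().__init__()
            self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

        def forward(self, x):
            h, _ = self.attn(x.unsqueeze(0), x.unsqueeze(0), x.unsqueeze(0))
            return h.squeeze(0)

    class BalancedLayer(nn.Module):
        # Combines the two branches with a learned per-vertex gate, so
        # each vertex decides how much global context to admit.
        def __init__(self, dim):
            super().__init__()
            self.local = LocalModule(dim)
            self.global_ = GlobalModule(dim)
            self.gate = nn.Linear(2 * dim, 1)

        def forward(self, x, adj):
            l = self.local(x, adj)
            g = self.global_(x)
            a = torch.sigmoid(self.gate(torch.cat([l, g], dim=-1)))
            return a * l + (1 - a) * g

    # Toy usage: 5 vertices on a ring, 8-dim features.
    x = torch.randn(5, 8)
    adj = torch.eye(5).roll(1, dims=0) + torch.eye(5).roll(-1, dims=0)
    out = BalancedLayer(8)(x, adj)  # shape (5, 8)

One way to read the abstract's warm-up strategy, under the same assumptions, is to train the local branch first (gate fixed near 1) and only then unfreeze the global branch, so global attention refines rather than overwrites the learned local structure.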

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 License.
