Advancing Graph Neural Networks: Expressivity Enhancement, Information Flow Optimization, and Dynamic Process Modelling


Authors

Hevapathige, Asela

Abstract

Graph Neural Networks (GNNs) have emerged as powerful tools for learning representations from graph-structured data, yet fundamental limitations constrain their effectiveness across diverse applications. This thesis addresses three critical challenges in graph representation learning: improving structural expressivity, optimizing information propagation, and modelling dynamic processes. First, we address the expressivity limitations of standard message-passing neural networks, which are bounded by the 1-Weisfeiler-Lehman (1-WL) test. Using permutation-invariant graph partitioning, we develop Graph Partitioning Neural Networks that capture complex structural interactions by explicitly modelling how different graph components interact. Our theoretical analysis demonstrates that this approach efficiently approaches 3-WL expressivity while remaining computationally tractable. Next, we develop two complementary frameworks for adaptive depth allocation in GNNs. The first uses learnable Bakry-Emery curvature to capture both structural properties and diffusion dynamics, showing that nodes with higher curvature require fewer message-passing iterations for effective representation learning. The second provides theoretical foundations linking neighborhood characteristics to optimal aggregation strategies under different homophily conditions, revealing that aggregation requirements depend on the balance between same-label neighbors, opposite-label neighbors, and the local degree distribution. Both frameworks remove the need for separate architectures for homophilic and heterophilic graphs. We then address limitations of existing diffusion-based GNNs through two new frameworks. The Generalized Opinion Dynamics Neural Framework unifies multiple opinion-dynamics models, incorporating node-specific stubbornness, dynamic neighborhood influence, and structural regularization to enable diverse convergence behaviors including single consensus, multi-consensus, and individualized consensus.
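The family of opinion-dynamics models that such a framework unifies can be illustrated with the classical Friedkin-Johnsen update, in which each node mixes its innate opinion (weighted by a stubbornness coefficient) with the mean opinion of its neighbors. The sketch below is a generic baseline for intuition, not the thesis framework itself; the function and parameter names are illustrative.

```python
import numpy as np

def friedkin_johnsen(adj, x0, stubbornness, iters=100):
    """Classical Friedkin-Johnsen opinion dynamics (illustrative baseline).

    Each node i updates as:
        x_i <- s_i * x0_i + (1 - s_i) * mean of neighbors' opinions
    where s_i in [0, 1] is node i's stubbornness (s_i = 1: never changes,
    s_i = 0: pure neighborhood averaging).
    """
    adj = np.asarray(adj, dtype=float)
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0                 # isolated nodes just keep their opinion
    W = adj / deg                       # row-normalized adjacency (neighbor mean)
    x0 = np.asarray(x0, dtype=float)
    s = np.asarray(stubbornness, dtype=float)
    x = x0.copy()
    for _ in range(iters):
        x = s * x0 + (1.0 - s) * (W @ x)
    return x

# Triangle graph: two fully stubborn nodes anchor opinions 0 and 1;
# the open-minded third node settles at their average, 0.5.
adj = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
print(friedkin_johnsen(adj, [0.0, 1.0, 0.0], [1.0, 1.0, 0.0], iters=50))
```

Varying the stubbornness vector already yields qualitatively different limits (full consensus, persistent disagreement), which is the behavior space the generalized framework is designed to cover and extend.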
For influence maximization, we develop Deep Sheaf Networks that model propagation as a sheaf diffusion-reaction process, introducing pointwise and coupled dynamics operators to capture both intrinsic node updates and neighborhood interactions across progressive and non-progressive diffusion models. A subgraph-based optimization method reduces the combinatorial search space while accounting for overlapping influence among nodes. In summary, the theoretical contributions of this thesis provide rigorous foundations for understanding structural interactions, information-flow optimization, and dynamic process modelling in graph neural networks. Extensive experiments across node classification, graph classification, graph regression, influence estimation, and influence maximization tasks show that our approaches consistently outperform existing methods.
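For context on the influence-maximization setting, the standard baseline is greedy hill-climbing over Monte-Carlo estimates of spread under the independent-cascade model. The sketch below shows that baseline only, not the subgraph-based method described above; all names and parameters are illustrative.

```python
import random

def simulate_ic(adj, seeds, p, rng):
    """One run of the independent-cascade model: each newly activated node
    gets a single chance to activate each inactive neighbor with prob p.
    Returns the total number of activated nodes (the spread)."""
    active = set(seeds)
    frontier = list(seeds)
    while frontier:
        nxt = []
        for u in frontier:
            for v in adj[u]:
                if v not in active and rng.random() < p:
                    active.add(v)
                    nxt.append(v)
        frontier = nxt
    return len(active)

def greedy_im(adj, k, p=0.1, runs=200, seed=0):
    """Greedy hill-climbing seed selection: repeatedly add the node with
    the largest estimated marginal spread. A classic baseline whose cost
    motivates search-space reduction techniques."""
    rng = random.Random(seed)
    seeds = []
    for _ in range(k):
        best, best_spread = None, -1.0
        for v in adj:
            if v in seeds:
                continue
            spread = sum(simulate_ic(adj, seeds + [v], p, rng)
                         for _ in range(runs)) / runs
            if spread > best_spread:
                best, best_spread = v, spread
        seeds.append(best)
    return seeds

# Star graph: the hub (node 0) has the largest expected spread.
adj = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0]}
print(greedy_im(adj, k=1, p=0.5))
```

The inner loop evaluates every remaining candidate with fresh simulations, which is exactly the combinatorial cost that pruning the search space to promising subgraphs is meant to reduce.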
