Data Science

HPE-HGT: Hybrid Positional Encoding for Enhanced Structural Awareness in Heterogeneous Graph Transformers

MTSU Computational and Data Science Ph.D. student Nada Srour presents some of her work exploring approaches for optimizing heterogeneous graph neural networks in machine learning. Heterogeneous graph neural networks (HGNNs) are critical for modeling multi-relational data in domains such as recommender systems, biological networks, cybersecurity, and citation analysis. These graphs encode diverse node and edge types, capturing complex semantic interactions that demand expressive and type-aware representation learning. While recent graph transformer-based architectures have advanced this goal by leveraging attention over typed meta-paths, their effectiveness remains constrained by a fundamental limitation: an overemphasis on semantic structure at the expense of global topological awareness. To address this limitation, we introduce HPE-HGT, a novel hybrid positional encoding-based heterogeneous graph transformer that integrates both local semantic and global structural signals into the attention process.
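To make the idea of combining local semantic and global structural signals concrete, the sketch below shows one common way such a hybrid input could be assembled: type embeddings carry the semantic (type-aware) signal, Laplacian eigenvector positional encodings carry the global structural signal, and the two are summed into the representations fed to self-attention. This is a minimal illustration under assumed choices (Laplacian eigenvectors as the structural encoding, additive combination, a single attention head), not the paper's exact architecture; all names and dimensions here are hypothetical.

```python
import numpy as np

def laplacian_pe(adj, k):
    """Global structural signal: the first k non-trivial eigenvectors of
    the symmetric normalized graph Laplacian (one standard choice of
    positional encoding; HPE-HGT's exact encoding may differ)."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, deg ** -0.5, 0.0)
    lap = np.eye(len(adj)) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    _, vecs = np.linalg.eigh(lap)        # eigenvectors sorted by eigenvalue
    return vecs[:, 1:k + 1]              # skip the trivial constant mode

# Toy heterogeneous graph: 4 nodes of two types (0 = paper, 1 = author).
adj = np.array([[0, 1, 1, 0],
                [1, 0, 0, 1],
                [1, 0, 0, 1],
                [0, 1, 1, 0]], dtype=float)
node_types = np.array([0, 1, 0, 1])

rng = np.random.default_rng(0)
d = 8
feats = rng.normal(size=(4, d))          # local semantic node features
type_emb = rng.normal(size=(2, d))       # type-aware embedding table
pe = laplacian_pe(adj, k=2)              # global structural encoding
proj = rng.normal(size=(2, d))           # project the PE into feature space

# Hybrid representation: semantic features + type embedding + structural PE.
h = feats + type_emb[node_types] + pe @ proj

# Single-head scaled dot-product self-attention over the hybrid inputs.
scores = h @ h.T / np.sqrt(d)
attn = np.exp(scores - scores.max(axis=1, keepdims=True))
attn /= attn.sum(axis=1, keepdims=True)
out = attn @ h                           # structure- and type-aware outputs
```

Because the structural encoding enters before the attention scores are computed, two nodes that are far apart in the graph attend to each other differently than topologically close ones, which is the kind of global awareness the abstract argues plain meta-path attention lacks.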

Watch the webinar here.