SuperHyperGraph Attention Networks

Takaaki Fujita
Arif Mehmood

Abstract

Graph Attention Networks (GAT) employ self-attention to aggregate neighboring node features in graphs, effectively capturing structural dependencies. HyperGraph Attention Networks (HGAT) extend this mechanism to hypergraphs by alternating attention-based vertex-to-hyperedge and hyperedge-to-vertex updates, thereby modeling higher-order relationships. In this work, we introduce the n-SuperHyperGraph Attention Network, which leverages SuperHyperGraphs, a hierarchical generalization of hypergraphs, to perform multi-tier attention among supervertices and superedges. Our investigation is purely theoretical; empirical validation via computational experiments is left for future study.
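To make the alternating update that the abstract describes concrete, the sketch below implements one simplified HGAT-style layer in NumPy: hyperedges attend over their member vertices, then vertices attend over their incident hyperedges. This is a minimal illustration, not the paper's construction; the function names, the tanh scoring (which, for brevity, depends on the sender's features only rather than on sender-receiver pairs), and all shapes are illustrative assumptions.

```python
import numpy as np


def masked_softmax(scores, mask):
    """Row-wise softmax restricted to entries where mask is True.

    Assumes every row of `mask` has at least one True entry,
    so the normalizer is never zero.
    """
    scores = np.where(mask, scores, -np.inf)
    scores = scores - scores.max(axis=1, keepdims=True)
    weights = np.exp(scores) * mask
    return weights / weights.sum(axis=1, keepdims=True)


def hgat_layer(X, H, W, a_ve, a_ev):
    """One alternating attention update in the HGAT style (illustrative).

    X    : (n, d)  vertex feature matrix
    H    : (n, m)  boolean incidence matrix; H[v, e] is True iff
                   vertex v belongs to hyperedge e
    W    : (d, k)  shared linear projection
    a_ve : (k,)    scoring vector, vertex-to-hyperedge step
    a_ev : (k,)    scoring vector, hyperedge-to-vertex step
    """
    Z = X @ W                                # projected vertex features, (n, k)

    # Vertex -> hyperedge: each hyperedge attends over its member vertices.
    s_v = np.tanh(Z @ a_ve)                  # one score per vertex, (n,)
    alpha = masked_softmax(np.broadcast_to(s_v, (H.shape[1], H.shape[0])), H.T)
    E = alpha @ Z                            # hyperedge features, (m, k)

    # Hyperedge -> vertex: each vertex attends over its incident hyperedges.
    s_e = np.tanh(E @ a_ev)                  # one score per hyperedge, (m,)
    beta = masked_softmax(np.broadcast_to(s_e, H.shape), H)
    return beta @ E                          # updated vertex features, (n, k)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, m, d, k = 6, 3, 4, 4
    X = rng.standard_normal((n, d))
    H = rng.random((n, m)) < 0.5
    H[H.sum(axis=1) == 0, 0] = True          # every vertex joins some hyperedge
    H[0, H.sum(axis=0) == 0] = True          # every hyperedge is non-empty
    W = 0.5 * rng.standard_normal((d, k))
    out = hgat_layer(X, H, W, rng.standard_normal(k), rng.standard_normal(k))
    print(out.shape)                         # (6, 4)
```

Informally, an n-SuperHyperGraph layer would repeat this vertex/hyperedge alternation across each tier of the supervertex and superedge hierarchy; the paper develops that multi-tier version formally.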


How to Cite
Fujita, T., & Mehmood, A. (2025). SuperHyperGraph Attention Networks. Neutrosophic Computing and Machine Learning, 40(1), 10-27. ISSN 2574-1101. https://fs.unm.edu/NCML2/index.php/112/article/view/867