GNNs & Hardware

Graph Neural Networks (GNNs) are booming across domains, and efficient execution is becoming as important as the algorithms themselves. Below is a categorized list of notable publications — surveys, FPGA frameworks, quantization methods, and core GNN models — with the publication venue noted for each.


Surveys & Reviews

A Review of FPGA-based Graph Neural Network Accelerator Architectures — ACM
Focused on FPGA-targeted GNN accelerators; provides taxonomy and representative architectures.
A Survey on Graph Neural Network Acceleration: Algorithms, Systems, and Customized Hardware — arXiv
Comprehensive survey covering algorithmic optimizations, system-level strategies, and custom hardware for GNN acceleration.
A Survey on Graph Neural Network Acceleration: A Hardware Perspective — IEEE
Focuses on hardware-level techniques and trade-offs when accelerating GNNs.
A Survey of Computationally Efficient Graph Neural Networks for Reconfigurable Systems — MDPI
Explores algorithmic strategies that make GNNs efficient on reconfigurable platforms such as FPGAs.

FPGA Frameworks & Accelerators

GenGNN: A Generic FPGA Framework for Graph Neural Network Acceleration — arXiv
A modular, high-level synthesis (HLS) framework for accelerating multiple GNN models on FPGAs.
DGNN-Booster: A Generic FPGA Accelerator Framework For Dynamic Graph Neural Network Inference — IEEE
Targets dynamic graphs with changing topology; optimizes inference for time-varying data structures.
Hardware Acceleration of Graph Neural Networks — IEEE
Explores various hardware optimization strategies to improve the performance of GNN computation.
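A common thread in these accelerators is pipelining the two phases of GNN inference: gathering and aggregating neighbor features, then applying a shared transform. As a rough illustration only (a minimal NumPy sketch with a toy graph; the `gcn_layer` function and shapes are my own assumptions, not code from any paper above):

```python
import numpy as np

def gcn_layer(A, H, W):
    """One mean-aggregation GNN layer: gather neighbor features,
    aggregate them, then apply a shared linear transform -- the two
    phases FPGA accelerators typically pipeline."""
    deg = A.sum(axis=1, keepdims=True)       # node degrees
    agg = (A @ H) / np.maximum(deg, 1)       # gather + mean-aggregate
    return np.maximum(agg @ W, 0.0)          # update: linear + ReLU

# Toy 3-node graph: node 0 is linked to nodes 1 and 2.
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)
H = np.eye(3)               # one-hot input features
W = np.ones((3, 2))         # placeholder weights
print(gcn_layer(A, H, W).shape)   # one output row per node
```

The aggregation step is sparse and memory-bound while the update step is dense and compute-bound, which is exactly the mismatch these FPGA designs exploit with separate hardware units.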

Quantization & Hardware-Aware Training

Approximation- and Quantization-Aware Training for Graph Neural Networks — IEEE
Introduces training-time approximations to maintain performance when deploying low-precision GNNs.
Deep Quantization of Graph Neural Networks with Run-Time Hardware-Aware Training — Springer
Explores deep quantization and training methods that adapt to runtime hardware constraints.
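The shared ingredient in quantization-aware training is "fake quantization": simulating low-precision rounding inside the float training loop so the model learns to tolerate it. A minimal sketch under my own assumptions (symmetric per-tensor scaling, NumPy; the `fake_quant` helper is illustrative, not code from either paper):

```python
import numpy as np

def fake_quant(x, num_bits=8):
    """Quantize x to a signed num_bits grid, then dequantize.
    The forward pass sees rounding error; in real QAT the gradient
    passes straight through the rounding (straight-through estimator)."""
    qmax = 2 ** (num_bits - 1) - 1
    scale = np.abs(x).max() / qmax if np.abs(x).max() > 0 else 1.0
    q = np.clip(np.round(x / scale), -qmax - 1, qmax)
    return q * scale

w = np.array([0.51, -1.2, 0.003, 0.75])
w4 = fake_quant(w, num_bits=4)    # values snapped to a coarse 4-bit grid
print(np.abs(w - w4).max())       # worst-case rounding error at 4 bits
```

Hardware-aware variants make `num_bits` (or the scale) depend on the measured cost of each layer on the target device rather than fixing it globally.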

GNN Architectures & Theory

Graph Attention Networks (GAT) — arXiv
Introduced attention mechanisms to GNNs, allowing learnable weighting of neighboring nodes.
Heterogeneous Graph Attention Network — ACM
Extends GAT to heterogeneous graphs with multiple node and edge types.
EGAT: Edge-Featured Graph Attention Network — Springer
Incorporates edge features directly into the attention mechanism for richer representations.
Transformers are Graph Neural Networks — arXiv (2025)
Highlights the structural equivalence between Transformers and GNNs: self-attention is message passing on a fully connected graph, unifying both under one view.
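The attention mechanism these papers build on fits in a few lines. Below is a hedged NumPy illustration of GAT-style neighbor weighting (the variable names, toy vectors, and the split of the attention vector into `a_src`/`a_dst` are my own framing, not code from the paper):

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def gat_attention(h_i, h_neighbors, a_src, a_dst):
    """Score each neighbor j of node i as LeakyReLU(a_src.h_i + a_dst.h_j),
    then softmax over the neighborhood -- the learnable weighting
    that GAT introduces over plain mean aggregation."""
    e = leaky_relu(a_src @ h_i + h_neighbors @ a_dst)
    e = np.exp(e - e.max())                 # numerically stable softmax
    alpha = e / e.sum()
    return alpha, alpha @ h_neighbors       # coefficients + weighted sum

h_i = np.array([1.0, 0.0])                  # features of node i
h_nb = np.array([[1.0, 0.0], [0.0, 1.0]])   # two neighbors
a_src = np.array([0.5, 0.5])
a_dst = np.array([1.0, -1.0])
alpha, out = gat_attention(h_i, h_nb, a_src, a_dst)
print(alpha)   # more weight lands on the neighbor resembling h_i
```

Heterogeneous and edge-featured variants (HAN, EGAT) extend the scoring function with per-edge-type parameters or edge feature terms, but the softmax-over-neighborhood skeleton stays the same.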

Applications & Learning Paradigms

Few-Shot Learning with Graph Neural Networks — OpenReview
Demonstrates how GNNs can perform few-shot learning by representing examples as a graph and propagating relational information.

Quick Takeaways