Daily arXiv Papers

Graph Neural Networks · Graph Learning · LLM × Graph

Showing 9 papers for 2026-04-02

Epileptic Seizure Detection in Separate Frequency Bands Using Feature Analysis and Graph Convolutional Neural Network (GCN) from Electroencephalogram (EEG) Signals
GNN Graph Learning

We propose a frequency-aware epileptic seizure detection framework that analyzes ictal-phase EEG signals, using features extracted from separate frequency bands and a graph convolutional network to capture spatiotemporal dynamics. The approach aims to improve interpretability and neurophysiological relevance compared with black-box deep learning models.
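
The two ingredients of such a pipeline can be sketched in a few lines: per-band spectral features per electrode, fed through one GCN propagation over an electrode graph. The band edges, toy electrode adjacency, and weight shapes below are illustrative assumptions, not details from the paper.

```python
import numpy as np

# Illustrative frequency bands (Hz); the paper's exact band split may differ.
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_powers(x, fs):
    """x: (channels, samples) EEG segment -> (channels, n_bands) log band power."""
    freqs = np.fft.rfftfreq(x.shape[1], d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x, axis=1)) ** 2
    feats = [psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=1)
             for lo, hi in BANDS.values()]
    return np.log(np.stack(feats, axis=1) + 1e-12)

def gcn_layer(A, X, W):
    """One GCN propagation: ReLU(D^-1/2 (A+I) D^-1/2 X W)."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    A_norm = A_hat / np.sqrt(np.outer(d, d))
    return np.maximum(A_norm @ X @ W, 0.0)

rng = np.random.default_rng(0)
eeg = rng.standard_normal((4, 512))                # 4 electrodes, 2 s at 256 Hz
A = np.array([[0, 1, 1, 0], [1, 0, 0, 1],         # toy electrode adjacency
              [1, 0, 0, 1], [0, 1, 1, 0]], float)
X = band_powers(eeg, fs=256)                       # (4 channels, 5 bands)
H = gcn_layer(A, X, rng.standard_normal((5, 8)) * 0.1)
print(X.shape, H.shape)  # (4, 5) (4, 8)
```

Band powers make the node features directly interpretable (each column is one named frequency band), which is the interpretability angle the abstract emphasizes.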

Is One Token All It Takes? Graph Pooling Tokens for LLM-based GraphQA
GNN Graph Learning LLM × Graph

To alleviate the information bottleneck in LLM-based GraphQA, we introduce graph pooling tokens that encode structural graph information more faithfully than simple mean pooling. Our method replaces or augments the single-token bottleneck with a pool of tokens that summarize substructures, enabling better reasoning over complex graphs.
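
The contrast the abstract draws can be sketched numerically: mean pooling collapses all node embeddings into one token, while a set of pooling tokens cross-attends over the nodes to keep several structural summaries. The learned-query formulation below is one common way to realize this and is an assumption, not the paper's exact architecture.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def mean_pool(H):
    """H: (n_nodes, d) node embeddings -> single (1, d) graph token."""
    return H.mean(axis=0, keepdims=True)

def pooling_tokens(H, Q):
    """Q: (k, d) learned queries -> (k, d) graph tokens via cross-attention."""
    attn = softmax(Q @ H.T / np.sqrt(H.shape[1]), axis=1)  # (k, n_nodes)
    return attn @ H

rng = np.random.default_rng(0)
H = rng.standard_normal((10, 16))   # 10 node embeddings from a GNN encoder
Q = rng.standard_normal((4, 16))    # 4 pooling-token queries

print(mean_pool(H).shape)           # (1, 16): one token for the whole graph
print(pooling_tokens(H, Q).shape)   # (4, 16): k tokens summarizing substructures
```

The k pooled tokens are what would be handed to the LLM in place of (or alongside) the single mean-pooled token, widening the bottleneck from one vector to k.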

A Cross-graph Tuning-free GNN Prompting Framework
GNN Graph Learning

We present a Cross-graph Tuning-free Prompting Framework (CTP) for GNNs that works on both homogeneous and heterogeneous graphs and can be deployed to unseen graphs without any parameter tuning. CTP enables plug-in prompting without task-specific fine-tuning, addressing generalization across graphs.

Generalization Bounds for Spectral GNNs via Fourier Domain Analysis
GNN Graph Learning Graph Theory

We study generalization for spectral GNNs in the Fourier domain, where each layer acts as a frequency-wise update with fixed spectrum and trainable parameters. We show Gaussian complexity is invariant under the Graph Fourier Transform and derive depth- and order-aware, data-dependent generalization bounds along with stability results.
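
The "frequency-wise update with fixed spectrum and trainable parameters" has a compact concrete form: transform node features into the Laplacian eigenbasis (the graph Fourier transform), scale each frequency by a learned response g(λ), and transform back. The toy graph and polynomial filter below are illustrative, not the paper's setup.

```python
import numpy as np

def graph_fourier(A):
    """Symmetric normalized Laplacian eigendecomposition (the GFT basis)."""
    d = A.sum(axis=1)
    L = np.eye(len(A)) - A / np.sqrt(np.outer(d, d))
    lam, U = np.linalg.eigh(L)
    return lam, U

def spectral_layer(X, lam, U, theta, W):
    """H' = U diag(g(lam)) U^T X W, with g a polynomial in lam (coeffs theta)."""
    g = sum(t * lam**k for k, t in enumerate(theta))   # frequency response, (n,)
    return U @ (g[:, None] * (U.T @ X)) @ W            # filter, then project

# 4-node path graph
A = np.array([[0, 1, 0, 0], [1, 0, 1, 0],
              [0, 1, 0, 1], [0, 0, 1, 0]], float)
lam, U = graph_fourier(A)
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 3))
H = spectral_layer(X, lam, U, theta=[1.0, -0.5], W=rng.standard_normal((3, 2)))
print(H.shape)  # (4, 2)
```

Because U is orthogonal, the GFT is an isometry, which is the structural fact behind the claimed invariance of Gaussian complexity under the transform.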

EmbedPart: Embedding-Driven Graph Partitioning for Scalable Graph Neural Network Training
GNN Graph Learning Graph Theory

We propose EmbedPart, an embedding-driven graph partitioning method to enable scalable distributed training of GNNs. By leveraging node embeddings to guide partitioning, EmbedPart balances computational load while reducing inter-machine communication, addressing the classic trade-off between partitioning overhead and quality.
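
A minimal sketch of the idea, under the assumption that clustering node embeddings drives the partition assignment (the tiny k-means here stands in for whatever assignment rule EmbedPart actually uses): cluster embeddings into k machines, then measure the cut edges that would become inter-machine communication.

```python
import numpy as np

def kmeans_partition(Z, k, iters=20):
    """Assign each node to one of k partitions by clustering its embedding."""
    # Spread initial centers across the index range (illustrative choice).
    centers = Z[np.linspace(0, len(Z) - 1, k).astype(int)].copy()
    for _ in range(iters):
        assign = ((Z[:, None] - centers[None]) ** 2).sum(-1).argmin(axis=1)
        for c in range(k):
            if (assign == c).any():
                centers[c] = Z[assign == c].mean(axis=0)
    return assign

def cut_edges(edges, assign):
    """Edges crossing partition boundaries = inter-machine communication."""
    return sum(assign[u] != assign[v] for u, v in edges)

rng = np.random.default_rng(1)
# Two well-separated embedding clusters -> nodes 0-4 and nodes 5-9
Z = np.vstack([rng.normal(0, 0.1, (5, 2)), rng.normal(3, 0.1, (5, 2))])
edges = [(i, i + 1) for i in range(9)]   # path graph: only edge (4, 5) crosses
assign = kmeans_partition(Z, k=2)
print(int(cut_edges(edges, assign)), np.bincount(assign))  # 1 [5 5]
```

When embeddings reflect graph proximity, clustering them yields balanced parts (here 5 nodes each) with few cut edges, which is exactly the load/communication trade-off the abstract describes.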

A Survey on Graph Neural Network Acceleration: Algorithms, Systems, and Customized Hardware
GNN Graph Learning

This survey reviews acceleration techniques for GNNs across algorithms, systems, and customized hardware, addressing scalability and latency challenges in real-world applications. It covers training and inference optimizations, system-level designs, and hardware accelerators tailored to GNN workloads.


BN-Pool: Bayesian Nonparametric Pooling for Graphs
GNN Graph Learning

We introduce BN-Pool, Bayesian Nonparametric Pooling for Graphs, the first clustering-based pooling method that adaptively determines the number of supernodes. The method uses a generative Bayesian nonparametric model to partition nodes into an unbounded set of clusters, with a training objective that combines the supervised task loss with an unsupervised term that reconstructs the graph topology.
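
The "unbounded set of clusters" comes from the Bayesian nonparametric prior. A Chinese restaurant process is the textbook example of such a prior (the paper's exact generative model may differ): each node joins an existing cluster with probability proportional to its size, or opens a new supernode with probability proportional to a concentration parameter alpha.

```python
import random

def crp_partition(n_nodes, alpha, seed=0):
    """Sample a partition with an a-priori unbounded number of clusters."""
    rng = random.Random(seed)
    sizes = []                       # cluster sizes, grows as needed
    assign = []
    for _ in range(n_nodes):
        weights = sizes + [alpha]    # existing clusters, then "new cluster"
        c = rng.choices(range(len(weights)), weights=weights)[0]
        if c == len(sizes):
            sizes.append(0)          # open a new supernode
        sizes[c] += 1
        assign.append(c)
    return assign

part = crp_partition(n_nodes=20, alpha=1.0)
print(max(part) + 1, "clusters for 20 nodes")
```

The number of supernodes is an outcome of the sampling, not a hyperparameter fixed in advance, which is the property that lets the pooling adapt per graph.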

ARCS: Autoregressive Circuit Synthesis with Topology-Aware Graph Attention and Spec Conditioning
GNN Graph Learning

ARCS (Autoregressive Circuit Synthesis) is a system for fast, SPICE-simulatable analog circuit generation, using a hybrid pipeline of a graph VAE and a flow-matching model, plus SPICE-based ranking. It achieves 99.9% simulation validity across 32 topologies using only 8 SPICE evaluations, vastly reducing search time.

Exact Graph Learning via Integer Programming
Graph Learning

Exact Graph Learning via Integer Programming formulates graph learning and causal structure discovery as an integer program that can be solved to global optimality, addressing the assumption-sensitivity of current greedy and approximate methods. The approach provides a framework for learning dependence structures without restrictive modeling assumptions.
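
To see what exactness buys over greedy search, here is a brute-force stand-in on 3 variables: enumerate every DAG and score each with a BIC under an assumed linear-Gaussian model, keeping the global optimum. The paper encodes this same optimality as an integer program instead of enumeration; the data-generating chain and scoring choices below are illustrative assumptions.

```python
import itertools
import numpy as np

def bic_score(X, dag):
    """Linear-Gaussian BIC of a DAG given as a set of (parent, child) edges."""
    n, p = X.shape
    score = 0.0
    for j in range(p):
        parents = [i for i in range(p) if (i, j) in dag]
        if parents:
            beta, *_ = np.linalg.lstsq(X[:, parents], X[:, j], rcond=None)
            resid = X[:, j] - X[:, parents] @ beta
        else:
            resid = X[:, j] - X[:, j].mean()
        score += -0.5 * n * np.log(resid.var() + 1e-12)
        score -= 0.5 * len(parents) * np.log(n)   # BIC complexity penalty
    return score

def is_acyclic(edges, p=3):
    """Kahn-style check: repeatedly peel off vertices with no unseen parents."""
    seen = set()
    while len(seen) < p:
        free = [v for v in range(p) if v not in seen
                and all(u in seen for u, w in edges if w == v)]
        if not free:
            return False
        seen.update(free)
    return True

rng = np.random.default_rng(0)
x0 = rng.standard_normal(500)
x1 = 2 * x0 + 0.1 * rng.standard_normal(500)   # chain x0 -> x1 -> x2
x2 = -x1 + 0.1 * rng.standard_normal(500)
X = np.column_stack([x0, x1, x2])

all_edges = list(itertools.permutations(range(3), 2))      # 6 possible edges
candidates = [set(e) for r in range(7)
              for e in itertools.combinations(all_edges, r)]
best = max((d for d in candidates if is_acyclic(d)), key=lambda d: bic_score(X, d))
print(sorted(best))
```

The exhaustive search recovers the chain's skeleton exactly (edges between 0-1 and 1-2, up to Markov-equivalent orientation); on more than a handful of variables, enumeration explodes and an IP solver with acyclicity constraints is what keeps the search exact.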