Daily arXiv Papers

Graph Neural Networks · Graph Learning · LLM × Graph

Showing 10 papers for 2026-02-19

Investigating GNN Convergence on Large Randomly Generated Graphs with Realistic Node Feature Correlations
GNN Graph Learning

This work argues that prior results on GNN convergence on large random graphs ignore node feature correlations common in real networks. It proposes a novel random graph generator that embeds realistic node-feature correlations and uses it to study how such correlations affect GNN convergence and expressive power in large graphs. The findings indicate that feature correlations can significantly alter convergence behavior compared with i.i.d. feature models.
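A minimal sketch of what such a generator might look like: a two-block stochastic block model whose node features are correlated with community membership, with a parameter `rho` interpolating between i.i.d. features and fully structure-determined features. This is illustrative only; the paper's generator and its correlation model differ.

```python
import numpy as np

def correlated_feature_graph(n, p_in, p_out, rho, d, seed=0):
    """Sample a 2-block SBM whose node features correlate with community
    membership (illustrative sketch, not the paper's generator).

    rho in [0, 1] interpolates between i.i.d. features (rho=0) and
    features fully determined by the community mean (rho=1).
    """
    rng = np.random.default_rng(seed)
    comm = rng.integers(0, 2, size=n)                 # community labels
    # Edge probability depends on whether endpoints share a community.
    same = comm[:, None] == comm[None, :]
    probs = np.where(same, p_in, p_out)
    upper = np.triu(rng.random((n, n)) < probs, k=1)
    adj = upper | upper.T                             # symmetric, no self-loops
    # Community-dependent means induce feature-structure correlation.
    means = np.stack([np.full(d, -1.0), np.full(d, 1.0)])
    feats = rho * means[comm] + np.sqrt(1 - rho**2) * rng.standard_normal((n, d))
    return adj, feats, comm
```

Sweeping `rho` from 0 to 1 on graphs like these is one way to probe how feature correlation changes GNN convergence relative to the i.i.d. baseline.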

Graph neural network for colliding particles with an application to sea ice floe modeling
GNN Graph Learning

The authors model sea ice floes as graph nodes, with edges capturing physical interactions such as collisions. They introduce the Collision-Captured Network (CN), a GNN that integrates data assimilation to improve dynamic forecasting. Compared with traditional numerical solvers, CN offers scalable, efficient, collision-aware dynamics modeling.
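The basic graph construction can be sketched as a proximity graph over floe positions: connect two floes whenever their disks overlap. This is a minimal sketch; the paper's CN uses richer interaction edges plus data assimilation on top of such a graph.

```python
import numpy as np

def collision_graph(pos, radii):
    """Connect ice floes (modeled as disks) whose boundaries touch or
    overlap -- a minimal collision-edge sketch, not the paper's full
    interaction model."""
    n = len(pos)
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            # Disks i and j collide when their centers are closer than
            # the sum of their radii.
            if np.linalg.norm(pos[i] - pos[j]) <= radii[i] + radii[j]:
                edges.append((i, j))
    return edges
```

A GNN then does message passing only along these collision edges, which keeps the computation sparse as the number of floes grows.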

Hardware-accelerated graph neural networks: an alternative approach for neuromorphic event-based audio classification and keyword spotting on SoC FPGA
GNN Graph Learning

The authors present an FPGA-based implementation of event-graph neural networks for neuromorphic audio processing on a SoC device. By using a cochlea-inspired front-end to convert time-series data into sparse events, the approach reduces memory and computation. The hardware design enables low-latency, energy-efficient audio classification and keyword spotting.
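The event-encoding idea can be illustrated with simple delta modulation: emit a signed event each time the signal moves by a fixed threshold from the last emitted level, so smooth or quiet stretches produce few events. This is a deliberately simplified sketch; the paper's cochlea-inspired front-end is more elaborate.

```python
import numpy as np

def to_events(signal, threshold):
    """Convert a dense 1-D signal into sparse (index, polarity) events
    via delta modulation: an event fires whenever the signal moves by
    `threshold` from the last event's level."""
    events = []
    level = signal[0]
    for i, x in enumerate(signal):
        while x - level >= threshold:
            level += threshold
            events.append((i, +1))   # upward crossing
        while level - x >= threshold:
            level -= threshold
            events.append((i, -1))   # downward crossing
    return events
```

The resulting event stream is typically much shorter than the raw sample sequence, which is what lets the FPGA design cut memory and compute.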

Including Node Textual Metadata in Laplacian-constrained Gaussian Graphical Models
Graph Learning Graph Theory

The paper extends Laplacian-constrained Gaussian Graphical Models to jointly leverage node signals and associated textual metadata. It provides an optimization formulation and develops an efficient majorization-minimization algorithm for joint graph and metadata estimation. Experiments show improved graph recovery and predictive performance when metadata is incorporated.
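The base objective being extended can be sketched as the standard Laplacian-constrained GGM negative log-likelihood: parameterize the precision matrix as a graph Laplacian built from nonnegative edge weights, and evaluate tr(SL) minus the log pseudo-determinant of L. This sketch omits the paper's metadata term and its majorization-minimization solver.

```python
import numpy as np
from itertools import combinations

def laplacian_from_weights(w, n):
    """Build a combinatorial graph Laplacian from nonnegative edge
    weights w, ordered over the n*(n-1)/2 upper-triangular pairs."""
    L = np.zeros((n, n))
    for (i, j), wij in zip(combinations(range(n), 2), w):
        L[i, j] = L[j, i] = -wij
    np.fill_diagonal(L, -L.sum(axis=1))   # rows sum to zero
    return L

def neg_log_likelihood(w, S):
    """Laplacian-constrained GGM objective tr(S L) - log pdet(L),
    where pdet is the pseudo-determinant over nonzero eigenvalues
    (a sketch of the standard formulation, without the metadata term)."""
    L = laplacian_from_weights(w, S.shape[0])
    eig = np.linalg.eigvalsh(L)
    pos = eig[eig > 1e-10]                # drop the zero eigenvalue(s)
    return np.trace(S @ L) - np.log(pos).sum()
```

Minimizing this over nonnegative `w` given a sample covariance `S` recovers a graph; the paper's contribution is to couple that estimation with node textual metadata.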

Edge-Local and Qubit-Efficient Quantum Graph Learning for the NISQ Era
Graph Learning

This work proposes a fully quantum graph convolutional architecture tailored for unsupervised learning on NISQ devices. The model uses an edge-local design and qubit-efficient representations to reduce circuit depth and qubit counts, combining a variational feature extraction layer with quantum graph operations. It demonstrates practical quantum graph learning on near-term hardware while balancing resource constraints.

Cardinality-Preserving Attention Channels for Graph Transformers in Molecular Property Prediction
GNN Graph Learning

CardinalityGraphFormer introduces a cardinality-preserving attention (CPA) channel for graph transformers to retain dynamic signal cardinalities alongside static centrality embeddings. It blends structured sparse attention with Graphormer-inspired biases and dual-objective self-supervised pretraining (masked reconstruction and contrastive learning). Results show improved molecular property predictions, especially with limited data.
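The motivation for a cardinality channel can be shown in a few lines: softmax attention produces convex combinations, so two neighborhoods with identical value distributions but different sizes collapse to the same output. One hypothetical fix, sketched below, concatenates the attended value with a copy rescaled by the neighborhood cardinality; this is an illustrative simplification, not the paper's CPA channel.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def cardinality_preserving_attention(Q, K, V, mask):
    """Hypothetical sketch of a cardinality-preserving attention channel.
    Plain softmax attention is size-invariant; rescaling the attended
    value by log(1 + |N(v)|) reinjects the neighborhood-size signal."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    scores = np.where(mask, scores, -np.inf)   # restrict to neighbors
    attn = softmax(scores, axis=-1)            # convex weights
    out = attn @ V
    card = mask.sum(axis=-1, keepdims=True)    # neighborhood cardinality
    return np.concatenate([out, out * np.log1p(card)], axis=-1)
```

With identical keys and values, the plain-attention half of the output cannot distinguish a degree-2 node from a degree-4 node, while the cardinality half can.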

Leveraging Large Language Models for Causal Discovery: a Constraint-based, Argumentation-driven Approach
Graph Learning Graph Theory

The paper explores using large language models to assist causal discovery within a constraint-based, argumentation-driven framework. Causal Assumption-based Argumentation (ABA) ensures correspondence between input constraints and output graphs and enables principled integration of data and expert knowledge. The approach offers a scalable, interpretable way to combine symbolic reasoning with observational data when learning causal graphs.

Federated Graph AGI for Cross-Border Insider Threat Intelligence in Government Financial Schemes
Graph Learning

FedGraph-AGI introduces a federated learning framework for cross-border insider threat intelligence in government financial schemes. It addresses privacy constraints, cross-jurisdiction data sharing, and the need to reason about complex multi-step attack patterns within graph-structured financial networks. The approach enables privacy-preserving collaboration to enhance threat intelligence across borders.

GDGB: A Benchmark for Generative Dynamic Text-Attributed Graph Learning
Graph Learning

GDGB is a benchmark for generative dynamic text-attributed graph (DyTAG) learning. It addresses the poor textual quality of existing datasets and standardizes tasks and evaluation protocols for generative DyTAG problems, providing high-quality textual attributes and a unified evaluation to foster progress in generative modeling on DyTAGs.

Expressive Power of Graph Transformers via Logic
GNN Graph Learning

This study analyzes the expressive power of graph transformers (GTs), comparing real-number and floating-point regimes under soft attention and average hard attention. It shows that, over the reals, GTs restricted to first-order-definable vertex properties match the expressive power of FO logic, and it discusses how these findings carry over to practical floating-point settings, shedding light on the logical definability limits of GTs on graphs.
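A concrete example of an FO-definable vertex property of the kind such results quantify over: phi(v) = ∃x ∃y (E(v,x) ∧ E(v,y) ∧ E(x,y)), i.e. "v lies on a triangle". The check below is illustrative of what "FO-definable vertex property" means; the paper's results concern which such properties GTs can compute, not this particular one.

```python
from itertools import combinations

def on_triangle(adj, v):
    """Evaluate the FO-definable vertex property
        phi(v) = exists x, y: E(v,x) and E(v,y) and E(x,y),
    i.e. v lies on a triangle, given an adjacency matrix `adj`."""
    nbrs = [u for u, e in enumerate(adj[v]) if e]
    # phi holds iff some pair of v's neighbors is itself adjacent.
    return any(adj[x][y] for x, y in combinations(nbrs, 2))
```

Each fixed FO formula with one free vertex variable defines such a property, and the expressiveness question is whether a GT can compute the indicator of phi at every vertex.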