Daily arXiv Papers

Graph Neural Networks · Graph Learning · LLM × Graph

Showing 15 papers for 2026-03-12

Estimating condition number with Graph Neural Networks
GNN Graph Learning

This paper presents a fast method for estimating the condition number of sparse matrices using graph neural networks. It introduces a feature-engineering design for GNNs that achieves O(nnz + n) time for both training and inference. The authors also propose two prediction schemes for estimating condition numbers under the 1-norm and 2-norm, and report extensive experiments comparing the two schemes.
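The paper's exact feature design is not given in the summary; as a minimal sketch, per-row features of a sparse matrix (row 1-norm, diagonal magnitude, nonzeros per row) can be computed in O(nnz + n) from a CSR layout, the kind of cheap node features a GNN could consume. The feature set below is hypothetical.

```python
import numpy as np
from scipy.sparse import random as sparse_random

def node_features(A):
    """Per-row features computable in O(nnz + n) from a CSR matrix.
    Hypothetical feature set; the paper's actual design may differ."""
    A = A.tocsr()
    row_norm = np.abs(A).sum(axis=1).A1   # row 1-norms, O(nnz)
    diag = np.abs(A.diagonal())           # |diagonal|, O(n)
    nnz_per_row = np.diff(A.indptr)       # nonzeros per row, O(n)
    return np.column_stack([row_norm, diag, nnz_per_row])

A = sparse_random(100, 100, density=0.05, format="csr", random_state=0)
F = node_features(A)
print(F.shape)  # (100, 3)
```

Each node (matrix row) gets a small fixed-size feature vector, so the overall cost stays linear in the number of nonzeros.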

GaLoRA: Parameter-Efficient Graph-Aware LLMs for Node Classification
GNN Graph Learning LLM × Graph

The work studies text-attributed graphs (TAGs) and combines large language models with graph neural networks for node classification. It introduces parameter-efficient graph-aware LLMs that leverage textual content and graph structure with few extra parameters. The approach aims to improve node classification performance while remaining computationally efficient.

Causal Concept Graphs in LLM Latent Space for Stepwise Reasoning
LLM × Graph Graph Theory

Causal Concept Graphs (CCG) introduce a directed acyclic graph over sparse latent features, where edges encode causal dependencies between concepts. The method combines task-conditioned sparse autoencoders for concept discovery with differentiable structure learning to recover the graph, and proposes the Causal Fidelity Score (CFS) to evaluate graph-guided interventions in stepwise reasoning.
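The summary does not specify how the DAG is recovered; differentiable structure learners of this kind commonly rely on the NOTEARS acyclicity function h(W) = tr(exp(W ⊙ W)) − d, which is zero exactly when the weighted adjacency W encodes a DAG. A sketch of that constraint (not necessarily CCG's exact formulation):

```python
import numpy as np
from scipy.linalg import expm

def acyclicity(W):
    """NOTEARS-style acyclicity penalty: tr(exp(W * W)) - d.
    Zero iff the weighted adjacency matrix W encodes a DAG."""
    d = W.shape[0]
    return np.trace(expm(W * W)) - d

dag = np.array([[0.0, 1.0], [0.0, 0.0]])  # edge 0 -> 1, acyclic
cyc = np.array([[0.0, 1.0], [1.0, 0.0]])  # 0 <-> 1, cyclic
print(round(acyclicity(dag), 6), acyclicity(cyc) > 0)  # 0.0 True
```

Because h is differentiable, it can be added as a penalty to a likelihood or reconstruction loss, letting gradient descent search over graph structures.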

Graph-GRPO: Training Graph Flow Models with Reinforcement Learning
Graph Learning

Graph-GRPO presents an online reinforcement learning framework for training Graph Flow Models (GFMs) with verifiable rewards. It addresses the problem of aligning discrete flow-based graph generators with human preferences and task objectives, enabling controllable graph generation. The paper highlights two main contributions: an RL-based training loop and objective-aligned GFM optimization.

Spatio-Temporal Attention Graph Neural Network: Explaining Causalities With Attention
GNN Graph Learning

STA-GNN proposes a Spatio-Temporal Attention Graph Neural Network for unsupervised and explainable anomaly detection in Industrial Control Systems. The model uses spatio-temporal attention to highlight salient patterns and provides explanations to aid deployment and trust, even under baseline drift.

Resource-constrained Amazons chess decision framework integrating large language models and graph attention
GNN Graph Learning LLM × Graph

The paper presents a resource-efficient hybrid framework for the Game of Amazons by integrating large language models with graph attention. Designed for resource-constrained environments, it explores weak-to-strong generalization with limited data and computation by leveraging structured representations. The approach aims to achieve competitive play with reduced resource requirements.

Towards Intelligent Spectrum Management: Spectrum Demand Estimation Using Graph Neural Networks
GNN Graph Learning

The authors build a spectrum demand proxy from public deployment records and employ a hierarchical multi-resolution graph attention network (HR-GAT) to estimate demand at fine spatial scales. The model captures neighborhood effects and multiscale spatial dependencies to improve demand estimation.

BLITZRANK: Principled Zero-shot Ranking Agents with Tournament Graphs
Graph Theory

BLITZRANK introduces a principled zero-shot ranking framework based on tournament graphs for k-wise comparisons. The key idea is that a k-item comparison reveals a complete tournament of pairwise preferences, enabling more informative rankings. This provides an efficient, theory-guided approach to selecting the top-m items.
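The core combinatorial observation is cheap to illustrate: ranking k items in one comparison yields a complete tournament of k(k−1)/2 pairwise preferences. A minimal sketch, with a Copeland-style win count as one possible way to pick the top-m (BLITZRANK's actual aggregation may differ):

```python
from collections import Counter
from itertools import combinations

def tournament_edges(ranking):
    """One k-wise ranking (best to worst) induces a complete tournament:
    a directed (winner, loser) edge for every pair, k*(k-1)/2 in total."""
    return [(w, l) for w, l in combinations(ranking, 2)]

edges = tournament_edges(["c", "a", "d", "b"])     # k = 4 items
wins = Counter(w for w, _ in edges)                # Copeland-style score
top_m = [item for item, _ in wins.most_common(2)]  # select top-2
print(len(edges), top_m)  # 6 ['c', 'a']
```

A single 4-item query thus carries six pairwise signals, which is why k-wise comparisons are more informative per query than isolated pairwise ones.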

Resource Allocation in Hybrid Radio-Optical IoT Networks using GNN with Multi-task Learning
GNN Graph Learning

This work addresses resource allocation in hybrid RF and optical IoT networks using a graph neural network with multi-task learning. The proposed approach tackles joint throughput maximization and AoI minimization under energy and link-availability constraints, and scales better under partial observability by sharing representations across tasks.

Benchmarking Graph Neural Networks in Solving Hard Constraint Satisfaction Problems
GNN Graph Learning

The paper benchmarks GNNs on hard constraint satisfaction problems and proposes new hard benchmarks based on random instances. Experiments show that classical heuristics outperform current GNNs on truly hard instances, highlighting open challenges and suggesting directions for improvement.

Evaluating Progress in Graph Foundation Models: A Comprehensive Benchmark and New Insights
Graph Learning

This work evaluates progress in Graph Foundation Models with a comprehensive benchmark that jointly tests topic and format gaps. By varying both what the graphs describe and how they are represented, the benchmark reveals how knowledge transfers across both dimensions and provides new insights.

AMB-DSGDN: Adaptive Modality-Balanced Dynamic Semantic Graph Differential Network for Multimodal Emotion Recognition
Graph Learning

AMB-DSGDN is an Adaptive Modality-Balanced Dynamic Semantic Graph Differential Network for multimodal emotion recognition. It dynamically balances modalities and models evolving semantic graphs to capture inter- and intra-speaker emotional dynamics, addressing noise and dominance among modalities.

Structured Linked Data as a Memory Layer for Agent-Orchestrated Retrieval
Knowledge Graph Graph Learning

This work investigates using structured linked data (Schema.org, Linked Data Platform) as a memory layer to enhance retrieval in RAG systems. The study conducts controlled experiments across editorial, legal, travel, and e-commerce domains using Vertex AI Vector Search to measure improvements.

Maximum entropy temporal networks
Graph Theory

This paper introduces a maximum-entropy framework for temporal networks, yielding ensembles defined by global time processes and static edge probabilities. This time-edge factorization leads to modular models with closed-form log-likelihoods, enabling principled temporal network analysis.
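As a minimal sketch of what a time-edge factorization buys, assume a Bernoulli model where edge e is active in snapshot t with probability q_t · p_e (a global time factor times a static edge probability); the log-likelihood of an observed event sequence then has a simple closed form. The specific parameterization below is illustrative, not necessarily the paper's ensemble.

```python
import numpy as np

rng = np.random.default_rng(0)
n_times, n_edges = 15, 20
p = rng.uniform(0.1, 0.9, n_edges)  # static per-edge probabilities
q = rng.uniform(0.2, 1.0, n_times)  # global activity of each snapshot

# time-edge factorization: P(edge e active at time t) = q_t * p_e
prob = np.clip(np.outer(q, p), 1e-9, 1 - 1e-9)

# sample binary snapshots, then evaluate the closed-form Bernoulli
# log-likelihood of the whole event sequence
X = (rng.uniform(size=(n_times, n_edges)) < prob).astype(int)
ll = np.sum(X * np.log(prob) + (1 - X) * np.log(1 - prob))
print(ll < 0)  # True
```

Because time and edge factors enter the likelihood separately, each can be modeled or fitted independently, which is the modularity the summary refers to.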

A Hypergraph-Based Framework for Exploratory Business Intelligence
Graph Theory

This paper proposes a hypergraph data model for Exploratory Business Intelligence (ExBI), with operators such as Source, Join, and View to support dynamic schema evolution and reusable materialized views. The approach aims to reduce reliance on domain experts and lower computational costs in Exploratory BI.
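To make the operator vocabulary concrete, here is a toy hypergraph model where nodes are attributes and each hyperedge groups the attributes exposed by a source or derived view. The API and operator semantics are hypothetical; the paper's definitions may differ.

```python
from dataclasses import dataclass, field

@dataclass
class Hypergraph:
    """Toy hypergraph data model: each named hyperedge is a set of
    attribute nodes. Hypothetical sketch of Source/Join/View operators."""
    edges: dict = field(default_factory=dict)  # name -> set of attributes

    def source(self, name, attrs):
        """Source: register a raw dataset as a hyperedge over its attributes."""
        self.edges[name] = set(attrs)

    def join(self, name, a, b):
        """Join: hyperedge over the union of two edges' attributes,
        connected through the attributes they share."""
        assert self.edges[a] & self.edges[b], "join needs a shared attribute"
        self.edges[name] = self.edges[a] | self.edges[b]

    def view(self, name, base, attrs):
        """View: reusable projection of an existing hyperedge."""
        self.edges[name] = self.edges[base] & set(attrs)

hg = Hypergraph()
hg.source("orders", ["order_id", "customer_id", "amount"])
hg.source("customers", ["customer_id", "region"])
hg.join("orders_customers", "orders", "customers")
hg.view("revenue_by_region", "orders_customers", ["region", "amount"])
print(sorted(hg.edges["revenue_by_region"]))  # ['amount', 'region']
```

Derived hyperedges like `revenue_by_region` can themselves be joined or projected again, which is how such a model supports schema evolution and view reuse without a fixed relational schema.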