Daily arXiv Papers

Graph Neural Networks · Graph Learning · LLM × Graph

Showing 19 papers for 2026-03-11

Are Expressive Encoders Necessary for Discrete Graph Generation?
GNN Graph Learning

Introduces GenGNN, a modular message-passing framework for discrete graph generation. Diffusion models using GenGNN achieve over 90% validity on Tree and Planar datasets, with 2-5x faster inference than graph transformers; for molecule generation, DiGress with a GenGNN backbone reaches 99.49% validity.

$P^2$GNN: Two Prototype Sets to boost GNN Performance
GNN Graph Learning

Presents P^2GNN, a plug-and-play technique that uses two prototype sets to boost MP-GNN performance. By enriching message passing with global prototype information, it mitigates reliance on local context and noisy neighborhoods, improving robustness in low-homophily settings.
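
The summary only says that message passing is enriched with global prototype information drawn from two prototype sets; the exact rule is not given, so the following is a minimal numpy sketch under that assumption. The function name `prototype_enriched_mp`, the per-set softmax attention, and the mixing coefficient `alpha` are all illustrative, not the paper's construction.

```python
import numpy as np

def prototype_enriched_mp(H, A, proto_sets, alpha=0.5):
    """One message-passing step mixing local neighbor aggregation with
    attention over global prototype sets (hypothetical sketch)."""
    deg = A.sum(axis=1, keepdims=True).clip(min=1)
    neigh = (A @ H) / deg                        # mean over neighbors (local context)
    global_msg = np.zeros_like(H)
    for P in proto_sets:                         # one attention pass per prototype set
        sim = H @ P.T                            # node-to-prototype similarity
        w = np.exp(sim - sim.max(axis=1, keepdims=True))
        w /= w.sum(axis=1, keepdims=True)        # softmax over prototypes
        global_msg += (w @ P) / len(proto_sets)  # prototype-weighted global message
    # alpha trades off local context against global prototype information
    return alpha * neigh + (1 - alpha) * global_msg
```

Because the global term does not depend on the neighborhood, nodes in noisy or low-homophily neighborhoods still receive a useful signal, which is the intuition the summary describes.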

Transductive Generalization via Optimal Transport and Its Application to Graph Node Classification
Graph Learning

Establishes representation-based transductive generalization bounds using optimal transport in a distribution-free setting where test features are available during training. The bounds are expressed via Wasserstein distances between encoded feature distributions and include global and class-wise forms, linking theory to practice.
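
The bounds above are stated in terms of Wasserstein distances between encoded feature distributions. As a point of reference only (this is not the paper's construction), the 1-D empirical 1-Wasserstein distance between two equal-size samples reduces to the mean absolute difference of their order statistics:

```python
import numpy as np

def w1_empirical(x, y):
    """Empirical 1-Wasserstein distance between two equal-size 1-D samples.
    In 1-D the optimal coupling matches sorted samples, so W1 is the mean
    absolute difference of the order statistics."""
    x, y = np.sort(np.asarray(x, dtype=float)), np.sort(np.asarray(y, dtype=float))
    assert x.shape == y.shape, "sketch assumes equal-size samples"
    return np.abs(x - y).mean()
```

In the transductive setting the two samples would be the encoded training and test features; a small distance then tightens the generalization bound.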

TA-GGAD: Testing-time Adaptive Graph Model for Generalist Graph Anomaly Detection
GNN Graph Learning

Introduces TA-GGAD, a testing-time adaptive graph model for generalist graph anomaly detection across domains. It analyzes feature mismatch and demonstrates improved detection under cross-domain shifts through test-time adaptation.

GNNs for Time Series Anomaly Detection: An Open-Source Framework and a Critical Evaluation
GNN Graph Learning

Provides an open-source framework for graph-based time series anomaly detection (TSAD) and offers a critical evaluation of current practices. It highlights issues in metric design and interpretation and proposes guidance for standardized evaluation of GNN-based TSAD.

A Graph-Based Approach to Spectrum Demand Prediction Using Hierarchical Attention Networks
Graph Learning

Presents HR-GAT, a hierarchical resolution graph attention network for predicting spectrum demand from geospatial data. It handles complex spatial patterns and mitigates spatial autocorrelation to improve spectrum management decisions.

Provable Filter for Real-world Graph Clustering
Graph Learning GNN

Offers a provable graph clustering method tailored to real-world graphs with heterophily. It provides theoretical guarantees and a principled filtering approach that remains effective beyond traditional homophily assumptions.

Scalable Message Passing Neural Networks: No Need for Attention in Large Graph Representation Learning
GNN Graph Learning

Proposes Scalable Message-Passing Neural Networks (SMPNNs), which replace attention with standard convolutional message passing inside a Transformer-like block with pre-layer normalization. The result is competitive with state-of-the-art graph transformers while scaling better in large-scale transductive learning.
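
The block structure described (a Transformer-style residual block whose attention sublayer is swapped for a fixed graph convolution, with pre-layer normalization) can be sketched as follows; the function names, the symmetric-normalized adjacency, and the ReLU MLP are standard choices assumed here, not details confirmed by the summary:

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    """Per-row layer normalization (no learned scale/shift for simplicity)."""
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def normalized_adjacency(A):
    """Symmetric normalization D^{-1/2} (A + I) D^{-1/2} with self-loops."""
    A_hat = A + np.eye(len(A))
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return d_inv_sqrt @ A_hat @ d_inv_sqrt

def smpnn_block(H, A_norm, W_msg, W1, W2):
    """Pre-LN Transformer-style block with the attention sublayer replaced
    by a fixed normalized-adjacency convolution (illustrative sketch)."""
    # Sublayer 1: message passing on the pre-normalized input, plus residual.
    H = H + A_norm @ layer_norm(H) @ W_msg
    # Sublayer 2: position-wise ReLU MLP with pre-norm and residual.
    H = H + np.maximum(layer_norm(H) @ W1, 0.0) @ W2
    return H
```

Since the convolution uses a fixed sparse operator instead of computing pairwise attention scores, each block costs O(edges) rather than O(nodes squared), which is where the scalability claim comes from.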

RF-Informed Graph Neural Networks for Accurate and Data-Efficient Circuit Performance Prediction
GNN Graph Learning

Introduces RF-Informed GNNs for circuit performance prediction. The topology-aware framework achieves accurate predictions from limited training data, offering a data-efficient alternative to traditional solvers.

GraphKeeper: Graph Domain-Incremental Learning via Knowledge Disentanglement and Preservation
Graph Learning

GraphKeeper enables graph domain-incremental learning by disentangling and preserving knowledge across domains. It extends graph incremental learning to multi-domain settings, helping graph foundation models adapt to new domains without forgetting prior ones.

SA$^{2}$GFM: Enhancing Robust Graph Foundation Models with Structure-Aware Semantic Augmentation
Graph Learning

SA^2GFM enhances the robustness of graph foundation models with Structure-Aware Semantic Augmentation. It encodes hierarchical structural priors to improve domain-adaptive representations and resilience to noise and perturbations.

Morphological-Symmetry-Equivariant Heterogeneous Graph Neural Network for Robotic Dynamics Learning
GNN Graph Learning

Proposes Morphological-Symmetry-Equivariant HGNN (MS-HGNN) for robotic dynamics learning. It injects morphological priors and symmetry constraints into a heterogeneous graph network, improving generalization, data efficiency, and applicability to multi-body dynamics.

Understanding the Use of a Large Language Model-Powered Guide to Make Virtual Reality Accessible for Blind and Low Vision People
Knowledge Graph Graph Learning

Investigates an LLM-powered guide for making virtual reality accessible to blind and low-vision users. In a study with 16 participants, users treated the guide as a tool when alone but interacted with it socially when others were present, yielding design implications for accessibility in VR.

MMGraphRAG: Bridging Vision and Language with Interpretable Multimodal Knowledge Graphs
Knowledge Graph Graph Learning

Bridges vision and language with interpretable multimodal knowledge graphs for RAG. MMGraphRAG preserves fine-grained visual structure and cross-modal reasoning paths to reduce hallucinations common in text-centric RAG.

VistaWise: Building Cost-Effective Agent with Cross-Modal Knowledge Graph for Minecraft
GNN Graph Learning

Proposes VistaWise, a cost-effective agent for Minecraft that integrates cross-modal domain knowledge and a dedicated object detector. It reduces the need for domain-specific data and fine-tuning for LLM-powered embodied agents.

LLM-Grounded Explainable AI for Supply Chain Risk Early Warning via Temporal Graph Attention Networks
GNN Graph Learning

Offers LLM-Grounded Explainable AI for supply chain risk early warning by coupling a Temporal Graph Attention Network with an LLM reasoning module. The approach yields both predictive signals and faithful natural-language explanations, demonstrated on maritime hubs.

Debiasing International Attitudes: LLM Agents for Simulating US-China Perception Changes
Graph Theory

Explores using LLM-driven agents to simulate and debias US-China perception changes. The framework disentangles sources of bias and tests whether LLMs can model human-like opinion evolution in response to external information.

SPARC: Spatial-Aware Path Planning via Attentive Robot Communication
Graph Theory

SPARC introduces Spatial-Aware Path Planning via Attentive Robot Communication. It embeds pairwise Manhattan distances into the attention weight computation, enabling each robot to dynamically prioritize messages from spatially relevant neighbors for decentralized multi-robot path planning.
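
The summary says pairwise Manhattan distances are embedded into the attention weight computation; one plausible minimal reading, sketched below, is a distance term that biases the attention logits so nearby robots dominate. The function name, the additive bias form, and the temperature `beta` are assumptions for illustration; SPARC's actual formulation may differ (e.g., learned distance embeddings).

```python
import numpy as np

def spatial_attention(query, keys, messages, positions, q_pos, beta=0.5):
    """Attention over teammates' messages where pairwise Manhattan distance
    biases the weights toward spatially relevant neighbors (hypothetical sketch)."""
    content = keys @ query                         # content relevance per robot
    dist = np.abs(positions - q_pos).sum(axis=1)   # pairwise Manhattan (L1) distances
    logits = content - beta * dist                 # closer robots get larger logits
    w = np.exp(logits - logits.max())
    w /= w.sum()                                   # softmax attention weights
    return w @ messages                            # distance-aware message aggregate
```

With the bias added before the softmax, a robot still attends to content, but two equally relevant teammates are weighted by proximity, matching the decentralized prioritization the summary describes.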

Evaluating the Practical Effectiveness of LLM-Driven Index Tuning with Microsoft Database Tuning Advisor
Graph Theory

Evaluates the practical effectiveness of LLM-driven index tuning with Microsoft Database Tuning Advisor. The study analyzes how well LLMs can predict query costs under different index configurations and discusses limitations and scenarios where the cost estimations diverge from optimizer guidance.