Exploring Kolmogorov-Arnold Networks for Graph Learning

Cappi, Riccardo
Academic Year 2023/2024

Abstract

Kolmogorov-Arnold Networks (KANs) have recently gained significant attention in the machine learning community as a promising alternative to Multi-Layer Perceptrons (MLPs). These models are theoretically grounded in the Kolmogorov-Arnold representation theorem and promise greater interpretability, generalization, and accuracy than MLPs. They have recently been studied across different domains, such as image and text processing, time-series analysis, and physics-informed problems. The aim of this work is to explore the applicability of KAN models in the graph domain. Specifically, we study an extension of Graph Neural Networks (GNNs), denoted Graph Kolmogorov-Arnold Networks (GKANs), which relies on the message-passing framework but replaces the MLP-based update function with a KAN-based architecture. Our study focuses on two key aspects: the predictive accuracy of GKANs and their interpretability, particularly in the context of spatial-temporal graphs. We first compare the performance of GKANs with that of state-of-the-art GNN approaches on graph classification tasks involving static graph datasets. Additionally, we introduce the GKAN Ordinary Differential Equation (GKAN-ODE) framework, in which we leverage the interpretable nature of GKANs to model the evolution of graphs over time and to extract symbolic formulas representing these dynamics. Our experiments highlight the promising capabilities of GKANs, especially in terms of interpretability; we also discuss the limitations and future potential of this approach.
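The sketch below is a minimal, illustrative rendering of the idea summarized in the abstract, not the thesis implementation: a message-passing layer whose update function is a (simplified) KAN layer rather than an MLP. The names KANLayer and GKANConv are hypothetical, a Gaussian radial-basis expansion stands in for the B-spline parameterization of the original KAN formulation, and a dense adjacency matrix with mean aggregation is assumed purely for brevity.

```python
# Illustrative sketch only: simplified KAN layer (Gaussian RBF basis instead of
# B-splines) used as the update function of a basic message-passing step.
import torch
import torch.nn as nn
import torch.nn.functional as F


class KANLayer(nn.Module):
    """Each input-output connection carries a learnable univariate function,
    approximated here as a linear combination of fixed Gaussian basis functions."""

    def __init__(self, in_dim, out_dim, num_basis=8, grid_min=-2.0, grid_max=2.0):
        super().__init__()
        self.register_buffer("centers", torch.linspace(grid_min, grid_max, num_basis))
        self.width = (grid_max - grid_min) / (num_basis - 1)
        # one coefficient per (input feature, basis function, output feature)
        self.coeffs = nn.Parameter(torch.randn(in_dim, num_basis, out_dim) * 0.1)
        self.base = nn.Linear(in_dim, out_dim)  # residual "base" branch

    def forward(self, x):  # x: (N, in_dim)
        # evaluate every basis function on every scalar input
        rbf = torch.exp(-((x.unsqueeze(-1) - self.centers) / self.width) ** 2)  # (N, in_dim, B)
        # sum the learned univariate functions over the input dimension
        phi = torch.einsum("nib,ibo->no", rbf, self.coeffs)  # (N, out_dim)
        return phi + self.base(F.silu(x))


class GKANConv(nn.Module):
    """Message passing with a KAN-based update: mean-aggregate neighbor
    features over a dense adjacency, then transform them with a KAN layer."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.kan = KANLayer(in_dim, out_dim)

    def forward(self, x, adj):  # adj: (N, N) dense adjacency
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        agg = adj @ x / deg  # mean aggregation of neighbor features
        return self.kan(agg)


# toy usage: 5 nodes with 4 features each, on a ring graph
x = torch.randn(5, 4)
adj = torch.eye(5).roll(1, dims=1) + torch.eye(5).roll(-1, dims=1)
out = GKANConv(4, 3)(x, adj)
print(out.shape)  # torch.Size([5, 3])
```

In the GKAN-ODE setting described in the abstract, a layer of this kind would plausibly serve as the right-hand side of an ODE governing the node features, with the learned univariate functions later read off as symbolic formulas; the basis choice and aggregation scheme shown here are assumptions made for the sake of a compact example.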
Keywords

KAN, Graphs, Deep Learning

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.12608/80198