
Online Lifelong Learning on Streaming Graphs

Donghi, Giovanni
2022/2023

Abstract

Continual Learning (CL) aims to learn new tasks incrementally while mitigating catastrophic forgetting. Online Continual Learning (OCL) narrows this focus to learning efficiently from a continuous, dynamically shifting data stream. Although recent studies have explored Continual Learning on graphs using Graph Neural Networks (GNNs), few address the unique challenges of streaming settings, where real-world graphs often evolve over time and require timely (online) predictions. Moreover, current approaches to node classification are not well aligned with standard OCL frameworks, partly because online Continual Learning on graphs lacks a clear definition. In this thesis, we propose a comprehensive framework for online Continual Learning on graphs, emphasizing efficient batch processing that accounts for graph topology. We introduce benchmark datasets for online continual graph learning and report the performance of various CL methods adapted to this framework. We further tackle the memory challenges posed by multi-hop neighborhood aggregation in GNNs, which can require access to an ever-expanding graph and thus incur prohibitive costs in this setting. To address this, we introduce a sampling strategy that controls complexity for efficient online learning. Finally, we investigate the potential of combining an untrained feature extractor with a linear classifier to mitigate forgetting.
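The abstract notes that multi-hop neighborhood aggregation in GNNs may require access to an ever-growing graph, and that sampling can bound this cost. As a rough, hypothetical illustration of the general idea (not the thesis's actual method), a bounded-fanout k-hop sampler over an adjacency-list graph might look like the following sketch; the function name, parameters, and data layout are assumptions for the example:

```python
import random

def sample_k_hop(adj, seeds, num_hops=2, fanout=5, rng=None):
    """Collect a bounded k-hop neighborhood around `seeds`.

    At each hop, keep at most `fanout` randomly chosen neighbors per
    node, so the sampled subgraph stays O(len(seeds) * fanout**num_hops)
    regardless of how large the full streaming graph grows.
    """
    rng = rng or random.Random(0)
    visited = set(seeds)          # nodes included in the subgraph so far
    frontier = list(seeds)        # nodes whose neighbors we expand next
    for _ in range(num_hops):
        next_frontier = []
        for node in frontier:
            neighbors = adj.get(node, [])
            if len(neighbors) > fanout:
                # Subsample high-degree nodes to cap per-hop growth.
                neighbors = rng.sample(neighbors, fanout)
            for nb in neighbors:
                if nb not in visited:
                    visited.add(nb)
                    next_frontier.append(nb)
        frontier = next_frontier
    return visited
```

In an online setting, such a sampler would run on each incoming batch so that only a fixed-size subgraph (rather than the entire historical graph) is needed for each GNN update.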
Keywords: continual learning, online learning, graph neural network
Files in this item:

File: Tesi_SGSS.pdf (restricted access)
Size: 4.05 MB
Format: Adobe PDF

The text of this website © Università degli studi di Padova. Full texts are published under a non-exclusive license. Metadata are released under a CC0 license.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.12608/76690