Analysis of the impact of class ordering in class incremental image classification

ABDELKARIM, MARWA MOHAMEDOSMAN HUSSEIN
2022/2023

Abstract

The benefits of incremental learning make it desirable for many real-world applications. It uses resources efficiently by eliminating the need to retrain from scratch whenever the set of tasks is updated. It also reduces memory usage, which is particularly important where privacy constraints apply, such as in the healthcare sector, where long-term storage of patient data is prohibited. The main challenge of incremental learning, however, is catastrophic forgetting: performance on previously learned tasks declines after a new task is learned. Various incremental learning methods have been proposed to overcome this challenge. In this work, we explore the influence of class ordering on class-incremental learning and the resilience of the method to different class orderings. We also examine how the complexity of the incremental learning scenario, i.e. the task split strategy, affects model performance. We start from a pre-existing approach and introduce extensions to improve its performance. Experimental results show that performance is not strongly affected by the order in which classes are presented, whereas the complexity of the incremental tasks plays a crucial role. In addition, starting with a larger number of classes typically results in better performance.
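To make the experimental setup concrete, the following is a minimal sketch (not the thesis code) of how class orderings and task split strategies might be constructed for a class-incremental benchmark; the dataset size, split sizes, and function names are illustrative assumptions.

```python
import random
from typing import List, Sequence


def make_class_order(num_classes: int, seed: int) -> List[int]:
    """Return one possible class ordering (here: a seeded random permutation).

    The thesis compares several orderings; a random shuffle is just one
    illustrative choice.
    """
    order = list(range(num_classes))
    random.Random(seed).shuffle(order)
    return order


def split_into_tasks(class_order: Sequence[int],
                     initial_classes: int,
                     increment: int) -> List[List[int]]:
    """Split an ordered list of classes into incremental tasks.

    `initial_classes` sets the size of the first (base) task and `increment`
    the number of new classes added at each later step; together they define
    the task split strategy discussed above.
    """
    tasks = [list(class_order[:initial_classes])]
    for start in range(initial_classes, len(class_order), increment):
        tasks.append(list(class_order[start:start + increment]))
    return tasks


if __name__ == "__main__":
    # Example with 100 classes (e.g. a CIFAR-100-like dataset): a scenario
    # with a large base task of 50 classes plus 5 per step, versus a harder
    # one with 10 classes per step from the start.
    order = make_class_order(num_classes=100, seed=0)
    easy_split = split_into_tasks(order, initial_classes=50, increment=5)
    hard_split = split_into_tasks(order, initial_classes=10, increment=10)
    print(len(easy_split), "tasks vs.", len(hard_split), "tasks")
```

Varying the seed in `make_class_order` yields different class orderings for the same split strategy, while varying `initial_classes` and `increment` changes the complexity of the incremental scenario, which is the axis the abstract reports as having the larger impact on performance.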
Keywords

Class-incremental learning
Class ordering
Self-supervision
Prototype augmentation
Deep feature space
Files in this item:

Abdelkarim_Marwa.pdf (Adobe PDF, 16.1 MB, open access)

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.12608/46067