Fully Connected Neural Networks for GLRT Test: A Synthetic Data Approach

WEHBE, IYAD
2023/2024

Abstract

This thesis addresses the challenge of enhancing the performance of Neural Networks (NNs) for one-class classification (OCC) problems, motivated by the computational constraints of the One-Class Support Vector Machine (OCSVM) and the theoretical framework of the Generalized Likelihood Ratio Test with Unknown Alternative (GLRT-UA). Although OCSVMs provably converge to the GLRT-UA under certain conditions, their performance is limited by computational inefficiency on high-dimensional data. To overcome this limitation, we propose a novel approach that uses an artificial dataset, labeled by the output of a trained OCSVM, to train an NN that replicates the decision boundary defined by the OCSVM with improved computational efficiency. Through simulation experiments and false-alarm and missed-detection tests, we demonstrate that the proposed method enhances the performance of NNs, making them a viable alternative in scenarios where OCSVMs are computationally impractical. The thesis thus bridges the theoretical and practical aspects of one-class classification and opens new possibilities for research and application in machine learning, especially in high-dimensional settings. Future directions include refining the proposed approach and extending it to diverse datasets and problem settings.
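
The following is a minimal sketch of the pipeline described in the abstract: fit an OCSVM on nominal data, label an artificial dataset with its output, and train a fully connected network to reproduce the OCSVM decision boundary. The sampling distribution, network size, and all hyperparameter values below are illustrative assumptions, not the configuration used in the thesis.

    # Sketch: distilling an OCSVM decision boundary into a neural network.
    # Library calls are standard scikit-learn/NumPy; parameter values are assumptions.
    import numpy as np
    from sklearn.svm import OneClassSVM
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)

    # Nominal (single-class) training data, e.g. samples under the null hypothesis.
    X_nominal = rng.normal(loc=0.0, scale=1.0, size=(2000, 10))

    # 1) Fit the one-class SVM on the nominal data.
    ocsvm = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(X_nominal)

    # 2) Generate an artificial dataset and label it with the OCSVM output
    #    (+1 inside the boundary, -1 outside). The broader sampling scale is an
    #    assumed choice so that both classes are represented.
    X_synth = rng.normal(loc=0.0, scale=1.5, size=(20000, 10))
    y_synth = ocsvm.predict(X_synth)

    # 3) Train a fully connected network to replicate the OCSVM boundary.
    nn = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
    nn.fit(X_synth, y_synth)

    # At test time only the NN is evaluated, avoiding the kernel expansion over
    # support vectors that makes OCSVM inference costly in high dimensions.
    agreement = (nn.predict(X_synth) == y_synth).mean()
    print(f"Agreement with OCSVM labels on the synthetic set: {agreement:.3f}")

In this sketch the NN is evaluated only on how closely it matches the OCSVM labels; the thesis additionally assesses false-alarm and missed-detection performance against held-out data.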
Keywords: Neural Networks, One-Class SVM, GLRT Test
Files in this item:
Wehbe_Iyad.pdf (restricted access, Adobe PDF, 728.93 kB)

Use this identifier to cite or link to this item: https://hdl.handle.net/20.500.12608/62294