Subgradient methods for the efficient learning of minimax risk classifiers
BERISTAIN CORTES, MAIALEN
2023/2024
Abstract
Machine learning techniques are widely used today. The most common scenario is supervised learning, where a classification task is defined in order to classify new data. A natural approach to defining a classifier is to seek the one that fails least often, which is equivalent to minimizing an error function. Several techniques and algorithms exist to solve such problems, but those applicable to nondifferentiable functions have a high computational cost. For a specific family of classifiers called Minimax Risk Classifiers, the resulting error function is not differentiable, so we build on the theoretical foundations of several subgradient methods to develop an algorithm for this specific case. The presented techniques allow the optimization problem to be solved efficiently.
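To illustrate the kind of optimization the abstract refers to, below is a minimal sketch of the classical subgradient method on a one-dimensional nondifferentiable convex function. This is a generic illustration only, not the algorithm developed in the thesis; the function, step-size rule, and iteration count are assumptions chosen for the example.

```python
import math

def subgradient_method(f, subgrad, x0, a0=1.0, iters=200):
    """Minimize a nondifferentiable convex f via subgradient steps.

    Uses diminishing step sizes a_k = a0 / sqrt(k). Subgradient steps
    are not guaranteed descent steps, so the best iterate seen so far
    is tracked and returned.
    """
    x = x0
    best_x, best_f = x0, f(x0)
    for k in range(1, iters + 1):
        x = x - (a0 / math.sqrt(k)) * subgrad(x)
        if f(x) < best_f:
            best_x, best_f = x, f(x)
    return best_x, best_f

# Example: f(x) = |x - 3| is convex but not differentiable at x = 3.
# sign(x - 3) is a valid subgradient everywhere (0 is valid at x = 3).
f = lambda x: abs(x - 3)
g = lambda x: (x > 3) - (x < 3)
x_star, f_star = subgradient_method(f, g, x0=0.0)
```

The diminishing step sizes guarantee convergence of the best objective value for convex functions, at the cost of slow progress near the optimum; this trade-off is what motivates the more refined subgradient schemes studied in the thesis.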
File: Tesi di Magistrale. Maialen Beristain Cortes.pdf (Adobe PDF, 1.2 MB, open access)
The text of this website © Università degli studi di Padova. Full texts are published under a non-exclusive license. Metadata are under a CC0 license.
https://hdl.handle.net/20.500.12608/71084