Attention mechanisms for efficient sequence modeling in machine learning models
LUCCHINI, ALBERTO
2024/2025
Abstract
The task of sequence modeling, especially of text sequences, has in recent years been a central focus of machine learning research. At the heart of these models, and of this line of research, lie attention mechanisms. This thesis presents and analyzes several kinds and interpretations of attention mechanisms, comparing them and exploring the key issues that afflict these mechanisms and that limit the efficiency, development, and adoption of models such as large language models (LLMs).

| File | Size | Format | Access |
|---|---|---|---|
| Lucchini_Alberto.pdf | 1.27 MB | Adobe PDF | Open access |
The text of this website © Università degli Studi di Padova. Full texts are published under a non-exclusive license; metadata are released under a CC0 license.
https://hdl.handle.net/20.500.12608/89358