Textual Embeddings and Neural Networks for Emotion and Slur Prediction: A Comparative Study of BERT Variants

TAVAKOLI, SINA
2024/2025

Abstract

This thesis explores the BERT model (Bidirectional Encoder Representations from Transformers), a language model that helps computer programs understand human language by capturing the nuances of how words relate to one another. By applying BERT, the work aims to shed light on social issues and topics of interest to sociology, the study of society. BERT is distributed as a family of pre-trained models that have already learned about language from huge amounts of text and are effective at resolving the meaning of words in different contexts. Alongside these models, BERT relies on tokenizers, which break sentences down into smaller pieces the model can process, and on embeddings, which turn words into numerical vectors that represent their meaning. These vectors make it possible to analyze words in terms of their relationships and patterns.
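A minimal sketch of the workflow the abstract describes is given below. It assumes the Hugging Face transformers library and the public bert-base-uncased checkpoint, both illustrative choices rather than the thesis's exact setup (the thesis compares several BERT variants): the tokenizer splits a sentence into subword pieces, and the model returns a contextual embedding for each piece.

import torch
from transformers import AutoTokenizer, AutoModel

# Load a pre-trained BERT checkpoint and its matching tokenizer.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

sentence = "The bank raised interest rates."
# The tokenizer breaks the sentence into subword tokens the model understands.
inputs = tokenizer(sentence, return_tensors="pt")
print(tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist()))
# ['[CLS]', 'the', 'bank', 'raised', 'interest', 'rates', '.', '[SEP]']

with torch.no_grad():
    outputs = model(**inputs)

# Each token receives a 768-dimensional vector; the same surface word
# (e.g. "bank") gets different vectors in different sentence contexts.
embeddings = outputs.last_hidden_state  # shape: (1, number_of_tokens, 768)
print(embeddings.shape)

These per-token vectors are the "embeddings" the abstract refers to: numbers that encode a word's meaning in context, which can then feed downstream classifiers such as the emotion and slur predictors studied in the thesis.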
Keywords: NLP, BERT, Neural Networks
Files in this item:
Thesis.pdf (open access), 1.73 MB, Adobe PDF

The text of this website is © Università degli studi di Padova. Full texts are published under a non-exclusive license. Metadata are released under a CC0 license.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.12608/93352