
Comparative Analysis of Spiking Neuron Models: Evaluating Computational Complexity and Accuracy

TAJBAKHSH, GOLNAZ
2023/2024

Abstract

Spiking Neural Networks (SNNs) are increasingly preferred over Artificial Neural Networks (ANNs). The inherent sparsity of SNNs makes them more energy-efficient, as neurons communicate only when they generate spikes, unlike ANNs, where neurons are continuously activated. SNNs more accurately reflect the biological information processing of the brain, allowing them to effectively capture and model temporal dynamics. In this study, we compare three spiking neuron models: the Leaky Integrate-and-Fire neuron model, the Alpha neuron model, and the Synaptic neuron model. To address the dead neuron problem, surrogate gradient descent is employed across all models during training. The models are evaluated on two metrics, computational complexity and accuracy, to determine which offers the best performance. Computational complexity is measured in FLOPs (floating-point operations), which count the arithmetic operations (additions and multiplications) required to process an input. A validation dataset is used to determine the optimal time step for each model, after which a single, consistent time step is selected for a fair comparison of FLOPs and accuracy.
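The two core ideas in the abstract can be illustrated with a minimal, library-free sketch (the thesis itself uses snnTorch; this pure-Python simplification and its parameter values are my own illustrative assumptions): a Leaky Integrate-and-Fire update with reset-by-subtraction, and a fast-sigmoid surrogate derivative of the kind used to replace the non-differentiable spike function during training.

```python
def fast_sigmoid_grad(u, threshold=1.0, slope=25.0):
    """Surrogate derivative of the spike function (fast sigmoid).
    Used during backpropagation in place of the Heaviside step,
    whose true gradient is zero almost everywhere (the dead neuron problem)."""
    return 1.0 / (1.0 + slope * abs(u - threshold)) ** 2

def lif_step(mem, x, beta=0.9, threshold=1.0):
    """One Leaky Integrate-and-Fire time step: decay, integrate, fire, reset."""
    mem = beta * mem + x                    # leaky integration of input current
    spk = 1.0 if mem >= threshold else 0.0  # non-differentiable Heaviside spike
    mem -= spk * threshold                  # reset by subtraction after a spike
    return spk, mem

# Drive one neuron with a constant input current over 10 discrete time steps.
mem, spikes = 0.0, []
for _ in range(10):
    spk, mem = lif_step(mem, x=0.3)
    spikes.append(spk)
print(spikes)  # sparse output: the neuron fires only when the membrane crosses threshold
```

The Synaptic and Alpha models extend this update with one or two extra exponentially decaying synaptic-current state variables, which is why their per-time-step FLOP counts, and hence the computational-complexity comparison in the thesis, differ from the plain LIF neuron's.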
Keywords: snnTorch, FLOPs, Accuracy
File in this record: TAJBAKHSH_GOLNAZ.pdf (Adobe PDF, 9.31 MB, restricted access)

The text of this website © Università degli Studi di Padova. Full texts are published under a non-exclusive license. Metadata are under a CC0 license.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.12608/76995