Finite size scaling effects in the performance of bio-inspired recurrent neural networks

VATANKHAHAN, MAHDY
2024/2025

Abstract

This thesis explores various methods for reducing the size of connectome networks from different species and evaluates how the reduced networks perform in predicting and executing both stochastic and deterministic tasks. These tasks were fed into echo-state networks as input signals, with the biological connectomes serving as reservoir neural networks. Different network attack strategies were then applied to progressively reduce the reservoir size, making it possible to study finite-size scaling effects on performance. These strategies included node removal guided by classical reservoir-computing metrics, random node removal, and a novel approach proposed in this work, termed renormalization-based coarse-graining, in which the most functionally related nodes are merged at each attack iteration. The attack strategies were first tested on synthetic neural networks to identify the most effective methods, and the same procedures were then applied to real connectomes from various species, including the human brain. Although the results varied across species, a shared power-law behavior was consistently observed in the performance-versus-reservoir-size curves over a specific range of sizes. These findings suggest that a relatively small fraction of highly interconnected nodes can retain the ability to perform non-random (deterministic) tasks, even after substantial network reduction.
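
As an illustration of the pipeline the abstract describes, the sketch below builds a minimal echo-state network on top of an adjacency matrix, trains a linear readout on a one-step-ahead prediction task, and applies two size-reduction strategies: random node removal, and a correlation-based merge standing in for the renormalization-based coarse-graining. This is not the thesis code: all function names are hypothetical, the random graph is a synthetic stand-in for a connectome, and the merge criterion (most correlated activity) is only an assumed reading of "most functionally related".

import numpy as np

def make_reservoir(adj, spectral_radius=0.9):
    # Rescale the adjacency matrix so its spectral radius is below 1
    # (a standard heuristic for the echo-state property).
    W = adj.astype(float)
    rho = np.max(np.abs(np.linalg.eigvals(W)))
    return W * (spectral_radius / rho)

def run_esn(W, u, input_scale=0.5, seed=0):
    # Drive the reservoir with a scalar input signal u and record states.
    rng = np.random.default_rng(seed)
    n = W.shape[0]
    w_in = rng.uniform(-input_scale, input_scale, n)
    x = np.zeros(n)
    states = np.empty((len(u), n))
    for t, u_t in enumerate(u):
        x = np.tanh(W @ x + w_in * u_t)
        states[t] = x
    return states

def train_readout(states, target, ridge=1e-6):
    # Closed-form ridge-regression readout, the usual ESN training step.
    n = states.shape[1]
    return np.linalg.solve(states.T @ states + ridge * np.eye(n),
                           states.T @ target)

def random_attack(adj, fraction, seed=0):
    # Random node removal: keep a random subset of the nodes.
    rng = np.random.default_rng(seed)
    n = adj.shape[0]
    keep = rng.choice(n, size=int(round(n * (1 - fraction))), replace=False)
    return adj[np.ix_(keep, keep)]

def coarse_grain_step(adj, states):
    # Toy coarse-graining: merge the two nodes whose activity time series
    # are most correlated (an assumed proxy for functional relatedness).
    C = np.nan_to_num(np.corrcoef(states, rowvar=False), nan=-np.inf)
    np.fill_diagonal(C, -np.inf)
    i, j = np.unravel_index(np.argmax(C), C.shape)
    A = adj.astype(float).copy()
    A[i, :] = 0.5 * (A[i, :] + A[j, :])
    A[:, i] = 0.5 * (A[:, i] + A[:, j])
    return np.delete(np.delete(A, j, axis=0), j, axis=1)

# Toy usage: one-step-ahead prediction of a deterministic signal on a
# random graph standing in for a connectome.
rng = np.random.default_rng(1)
adj = (rng.random((200, 200)) < 0.05).astype(float)
u = np.sin(0.2 * np.arange(1001))
W = make_reservoir(adj)
S = run_esn(W, u[:-1])
w_out = train_readout(S[100:], u[1:][100:])   # discard a washout period
nrmse = np.sqrt(np.mean((S[100:] @ w_out - u[1:][100:]) ** 2)) / np.std(u)
print(f"NRMSE at full size: {nrmse:.3f}")

adj_half = random_attack(adj, fraction=0.5)   # halve the network, then retrain
adj_merged = coarse_grain_step(adj, S)        # one coarse-graining iteration

In the thesis, a loop of such attack iterations followed by retraining would trace out the performance-versus-reservoir-size curve in which the power-law regime was observed.
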
Keywords

Finite-size scaling
Reservoir computing
Bio-inspired networks

File: Vatankhahan_Mahdy.pdf (Adobe PDF, 9.24 MB, restricted access)

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.12608/94354