
Aspects of Geometric Learning with Convolutional Neural Networks

DOS ANJOS VIEIRA FERREIRA, VITORIA
2023/2024

Abstract

This thesis presents a study of neural network architectures, tracing their development from foundational models to scattering networks and generalized scattering networks. The work is divided into four main chapters, each focusing on a different type of neural network and its contribution to machine learning. The first chapter covers Feed-Forward Neural Networks (FFNNs): it opens with a historical overview charting the evolution of these networks from their inception to modern applications, then introduces the basics of Learning Theory, which provides the theoretical underpinnings needed to understand how FFNNs learn, and concludes with a detailed description of their structure and key attributes. The second chapter shifts focus to Convolutional Neural Networks (CNNs), which are pivotal in image processing. It starts with an overview of image processing and of the role CNNs play in it; the section on matrix convolutions then develops the mathematical operation that underpins CNNs, followed by a discussion of the relationship between CNNs and FFNNs. The chapter also covers pooling operators, which reduce the dimensionality of feature maps and improve computational efficiency. The third chapter opens with a discussion of Geometric Priors for Neural Networks, emphasizing the importance of incorporating prior knowledge into network design, and then turns to Scattering Networks, an approach that uses wavelets to create invariant representations of data, highlighting its benefits in various applications. The final chapter builds on these ideas and discusses Generalized Scattering Networks: it gives an overview of the generalization process, introduces pooling operators in a continuous framework, and ends with an analysis of how these networks achieve robustness to small deformations.
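The abstract only names the convolution and pooling operations that CNNs rely on. As a purely illustrative aside, not drawn from the thesis itself, the short NumPy sketch below shows a "valid" 2D convolution followed by 2x2 max pooling on a toy single-channel image; the array sizes and the gradient-style kernel are arbitrary choices made for demonstration.

import numpy as np

# Hypothetical toy example; none of these names or values come from the thesis.
def conv2d(image, kernel):
    # "Valid" 2D cross-correlation of a single-channel image with a small kernel.
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(feature_map, size=2):
    # Non-overlapping max pooling: each size-by-size block is replaced by its maximum.
    h, w = feature_map.shape
    h, w = h - h % size, w - w % size            # crop so the map tiles evenly
    blocks = feature_map[:h, :w].reshape(h // size, size, w // size, size)
    return blocks.max(axis=(1, 3))

rng = np.random.default_rng(0)
image = rng.random((8, 8))                       # toy 8x8 single-channel "image"
kernel = np.array([[1.0, 0.0, -1.0]] * 3)        # simple horizontal-gradient filter
features = conv2d(image, kernel)                 # 6x6 feature map
pooled = max_pool(features)                      # 3x3 map after 2x2 max pooling
print(features.shape, pooled.shape)              # (6, 6) (3, 3)

The pooling step illustrates the dimensionality reduction the abstract refers to: a 2x2 max pool halves each spatial dimension of the feature map while keeping its strongest responses.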
Files in this record:
962374_tesidosanjos962374.pdf — 2.74 MB, Adobe PDF (not available; type: other attached material)

Documents in UNITESI are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14240/111951