NAM Pruning: Enhance Model Pruning via Network Activations Minimization


dc.contributor.advisor Pelillo, Marcello it_IT
dc.contributor.author Lazzaro, Dario <1998> it_IT
dc.date.accessioned 2023-02-18 it_IT
dc.date.accessioned 2023-05-23T12:57:35Z
dc.date.issued 2023-03-16 it_IT
dc.identifier.uri http://hdl.handle.net/10579/23259
dc.description.abstract Deep neural networks have become an essential tool in machine learning thanks to their excellent performance. However, their sheer number of parameters can make them infeasible to deploy on hardware with limited resources. In this work we propose a novel formulation that enforces sparsity in model activations, decreasing the number of firing neurons so that pruning methods can achieve better results. Our approach reduces the model's activation density by minimizing the activations' l0 pseudo-norm. Although the l0 pseudo-norm is the most suitable objective for approximating this quantity, it is nonconvex and discontinuous, and optimizing it is NP-hard. To achieve the desired behavior we therefore employ a differentiable and unbiased estimate of the actual l0 pseudo-norm. This estimate lets us formulate a novel training objective that minimizes the empirical risk while enforcing sparsity in the model activations (a minimal illustrative sketch follows this record). To demonstrate the suitability of our training method we define an iterative pruning scheme that enforces sparsity in model activations during the pruning steps. Finally, to validate our formulation, we propose an iterative pruning algorithm that exposes a trade-off between the final model's accuracy and size. We compare it with two well-known pruning approaches, showing that it can lead to better pruning performance. We empirically assess the effectiveness of our method on two distinct DNNs trained on CIFAR-10 and GTSRB. it_IT
dc.language.iso en it_IT
dc.publisher Università Ca' Foscari Venezia it_IT
dc.rights © Dario Lazzaro, 2023 it_IT
dc.title NAM Pruning: Enhance Model Pruning via Network Activations Minimization it_IT
dc.title.alternative NAM Pruning: Enhancing Model Pruning via Network Activations Minimization it_IT
dc.type Master's Degree Thesis it_IT
dc.degree.name Informatica - computer science it_IT
dc.degree.level Laurea magistrale it_IT
dc.degree.grantor Dipartimento di Scienze Ambientali, Informatica e Statistica it_IT
dc.description.academicyear 2021/2022 - extraordinary session it_IT
dc.rights.accessrights embargoedAccess it_IT
dc.thesis.matricno 869304 it_IT
dc.subject.miur INF/01 INFORMATICA it_IT
dc.date.embargoend 2024-05-22T12:57:35Z
dc.provenance.upload Dario Lazzaro (869304@stud.unive.it), 2023-02-18 it_IT
dc.provenance.plagiarycheck Marcello Pelillo (pelillo@unive.it), 2023-03-06 it_IT
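
A minimal sketch of the training objective the abstract describes: empirical risk (here cross-entropy) plus a differentiable surrogate of the l0 pseudo-norm of the hidden activations. The thesis's unbiased l0 estimator is not reproduced here; the tanh relaxation soft_l0, the TinyNet model, and the penalty weight lam are illustrative assumptions, not the author's implementation.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def soft_l0(activations: torch.Tensor, beta: float = 10.0) -> torch.Tensor:
        # Smooth stand-in for ||a||_0: tanh(beta * |a|) equals 0 at a == 0 and
        # approaches 1 as |a| grows, so the sum approximates the number of
        # firing (nonzero) activations. This is NOT the thesis's estimator.
        return torch.tanh(beta * activations.abs()).sum()

    class TinyNet(nn.Module):
        # Hypothetical two-layer classifier; stands in for the CIFAR-10 and
        # GTSRB networks evaluated in the thesis.
        def __init__(self, d_in: int = 3 * 32 * 32, d_hidden: int = 256,
                     n_classes: int = 10):
            super().__init__()
            self.fc1 = nn.Linear(d_in, d_hidden)
            self.fc2 = nn.Linear(d_hidden, n_classes)

        def forward(self, x: torch.Tensor):
            h = F.relu(self.fc1(x))  # hidden activations to be sparsified
            return self.fc2(h), h

    def training_step(model: TinyNet, x: torch.Tensor, y: torch.Tensor,
                      lam: float = 1e-4) -> torch.Tensor:
        # Objective from the abstract: empirical risk plus an activation-
        # sparsity penalty, weighted by the (assumed) hyperparameter lam.
        logits, h = model(x)
        return F.cross_entropy(logits, y) + lam * soft_l0(h) / x.shape[0]

Raising lam drives more activations toward zero at some cost in accuracy, mirroring the accuracy/size trade-off the abstract attributes to the iterative pruning algorithm.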

