Please use this identifier to cite or link to this item: http://repository.enp.edu.dz/jspui/handle/123456789/1919
Full item record
Dublin Core Element | Value | Language
dc.contributor.author | Laouichi, Anouar | -
dc.contributor.other | Berrani, Sid-Ahmed, Thesis supervisor | -
dc.contributor.other | Yous, Hamza, Thesis supervisor | -
dc.date.accessioned | 2020-12-22T10:14:37Z | -
dc.date.available | 2020-12-22T10:14:37Z | -
dc.date.issued | 2020 | -
dc.identifier.other | EP00078 | -
dc.identifier.uri | http://repository.enp.edu.dz/xmlui/handle/123456789/1919 | -
dc.description | Final-year project dissertation (Mémoire de Projet de Fin d'Études): Electronics: Algiers, École Nationale Polytechnique: 2020 | fr_FR
dc.description.abstract | This project deals with the optimization of Deep Neural Networks for efficient embedded inference. Network pruning and quantization techniques are implemented under the PyTorch environment and benchmarked on ResNet50. The obtained results, consisting of compression and speed-up rates, successfully validate the feasibility and the effectiveness of the concept. To show their practical potential, the two schemes have been applied to the RetinaNet object detector. Additionally, this work demonstrates that inference can be performed at the edge by reducing the model's memory footprint and the processing time, resulting in reduced latency and energy consumption as well as improved data security. Hence, new horizons of applications in embedded systems are opened up. | fr_FR
dc.language.iso | en | fr_FR
dc.subject | Artificial intelligence | fr_FR
dc.subject | Deep Neural Networks | fr_FR
dc.subject | Embedded Systems | fr_FR
dc.subject | Inference | fr_FR
dc.subject | Pruning | fr_FR
dc.subject | Quantization | fr_FR
dc.subject | Object detection | fr_FR
dc.subject | Pytorch | fr_FR
dc.title | Deep neural networks optimization for embedded platforms | fr_FR
dc.type | Thesis | fr_FR
Collection(s): Département Electronique
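Note: the abstract above names network pruning and quantization implemented under PyTorch and benchmarked on ResNet50, but the record itself contains no code. The following is only a minimal illustrative sketch of such a pipeline using PyTorch's built-in pruning and post-training dynamic quantization utilities (assuming a recent torch/torchvision install); the Conv2d/Linear layer selection, the 30% sparsity level, and the dummy input are hypothetical choices, not values taken from the thesis.

    # Illustrative sketch only; not the thesis's actual pipeline.
    import torch
    import torch.nn.utils.prune as prune
    import torchvision

    # Load a ResNet50 backbone (the benchmark network named in the abstract).
    model = torchvision.models.resnet50(weights=None)
    model.eval()

    # Network pruning: L1-norm unstructured pruning of convolution weights.
    for module in model.modules():
        if isinstance(module, torch.nn.Conv2d):
            prune.l1_unstructured(module, name="weight", amount=0.3)  # zero out 30% of weights
            prune.remove(module, "weight")  # make the sparsity permanent (drop the pruning mask)

    # Quantization: post-training dynamic quantization to int8.
    # Dynamic quantization in PyTorch targets Linear (and RNN) layers; a full static
    # int8 pipeline for the conv layers would need observers and calibration data.
    quantized = torch.quantization.quantize_dynamic(
        model, {torch.nn.Linear}, dtype=torch.qint8
    )

    # Sanity check: run a dummy image through the compressed model on CPU.
    with torch.no_grad():
        out = quantized(torch.randn(1, 3, 224, 224))
    print(out.shape)  # torch.Size([1, 1000])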

File(s) in this item:
File | Description | Size | Format
LAOUICHI.Anouar_BENAOUDA.Abderrahim.pdf | PN00820 | 6.28 MB | Adobe PDF


All items in DSpace are protected by copyright, with all rights reserved.