Efficient and Controllable Model Compression through Sequential Knowledge Distillation and Pruning

Please use this identifier to cite or link to this item:
https://doi.org/10.48693/517
Title: Efficient and Controllable Model Compression through Sequential Knowledge Distillation and Pruning
Author(s): Malihi, Leila
Heidemann, Gunther
Abstract: Efficient model deployment is a key focus in deep learning, and has led to the exploration of methods such as knowledge distillation and network pruning that compress models while improving their performance. In this study, we investigate the potential synergy between knowledge distillation and network pruning to achieve optimal model efficiency and improved generalization. We introduce a model compression framework that combines knowledge distillation, pruning, and fine-tuning to achieve enhanced compression while providing control over the degree of compactness. Our experiments are conducted on the popular CIFAR-10 and CIFAR-100 datasets, employing diverse model architectures including ResNet, DenseNet, and EfficientNet. The framework lets us calibrate the amount of compression, producing models with different degrees of compactness that remain as accurate as, or more accurate than, their uncompressed counterparts. Notably, we demonstrate its efficacy by producing two compressed variants of ResNet-101: ResNet-50 and ResNet-18. Our results reveal intriguing findings: in most cases, the pruned and distilled student models achieve comparable or superior accuracy to the distilled-only student models while using significantly fewer parameters.
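
The following is a minimal, illustrative sketch of the sequential pipeline described in the abstract (distill a large teacher into a smaller student, prune the student, then fine-tune), written in PyTorch. It is not the authors' implementation; the hyperparameters, model choices, and the `train_loader` are assumptions for demonstration only.

import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.nn.utils.prune as prune
from torchvision.models import resnet101, resnet18

def distillation_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.7):
    # Standard KD loss: temperature-scaled KL term on soft targets
    # plus cross entropy on the hard labels.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1 - alpha) * hard

def train_student(student, teacher, loader, epochs):
    # Train the student against the frozen teacher's soft targets.
    teacher.eval()
    opt = torch.optim.SGD(student.parameters(), lr=0.01, momentum=0.9)
    for _ in range(epochs):
        for images, targets in loader:
            opt.zero_grad()
            student_logits = student(images)
            with torch.no_grad():
                teacher_logits = teacher(images)
            loss = distillation_loss(student_logits, teacher_logits, targets)
            loss.backward()
            opt.step()

def prune_student(student, amount=0.3):
    # Globally mask the smallest-magnitude conv weights (unstructured L1 pruning).
    to_prune = [(m, "weight") for m in student.modules() if isinstance(m, nn.Conv2d)]
    prune.global_unstructured(to_prune, pruning_method=prune.L1Unstructured, amount=amount)
    return to_prune

def finalize_pruning(to_prune):
    # Bake the pruning masks into the weights once fine-tuning is done.
    for module, name in to_prune:
        prune.remove(module, name)

# Usage sketch (train_loader would be a CIFAR-10/100 DataLoader you define;
# the teacher is assumed to be trained already):
# teacher = resnet101(num_classes=10)
# student = resnet18(num_classes=10)
# train_student(student, teacher, train_loader, epochs=100)   # step 1: distillation
# masked = prune_student(student, amount=0.3)                 # step 2: pruning
# train_student(student, teacher, train_loader, epochs=20)    # step 3: fine-tuning (masks stay active)
# finalize_pruning(masked)

Keeping the pruning masks active during fine-tuning ensures the zeroed weights stay zero, so the parameter reduction is preserved in the final model.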
Bibliographic citation: Malihi L, Heidemann G. Efficient and Controllable Model Compression through Sequential Knowledge Distillation and Pruning. Big Data and Cognitive Computing. 2023; 7(3):154.
URL: https://doi.org/10.48693/517
https://osnadocs.ub.uni-osnabrueck.de/handle/ds-2024022810921
Keywords: knowledge distillation; teacher; student; model compression; parameter; pruning
Issue date: 19-Sep-2023
License name: Attribution 4.0 International
License URL: http://creativecommons.org/licenses/by/4.0/
Publication type: Single contribution in a scientific journal [Article]
Appears in collections: FB08 - Hochschulschriften
Open-Access-Publikationsfonds

Files in this item:
File | Description | Size | Format
Malihi_Heidemann_BDCC-07-00154_2023.pdf | Article | 3.08 MB | Adobe PDF


This item has been published under the following copyright terms: Creative Commons Attribution 4.0 International license.