
GSparsity: Unifying Network Pruning and Neural Architecture Search by Group Sparsity

In this paper, we propose a unified approach for network pruning and one-shot neural architecture search (NAS) via group sparsity. We first show that group sparsity via the recent Proximal Stochastic Gradient Descent (ProxSGD) algorithm achieves new state-of-the-art results for filter pruning. Then, we extend this approach to operation pruning, directly yielding a gradient-based NAS method based on group sparsity. Compared to existing gradient-based algorithms such as DARTS, the advantages of this new group sparsity approach are threefold. Firstly, instead of a costly bilevel optimization problem, we formulate the NAS problem as a single-level optimization problem, which can be optimally and efficiently solved using ProxSGD with convergence guarantees. Secondly, due to the operation-level sparsity, discretizing the network architecture by pruning less important operations can be safely done without any performance degradation. Thirdly, the proposed approach finds architectures that are both stable and well-performing on a variety of search spaces and datasets.
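The mechanism described in the abstract is a proximal gradient update with a group-lasso penalty, where each group collects the weights of one filter (for pruning) or one candidate operation (for NAS), so whole groups are driven exactly to zero and can then be removed. The following is a minimal NumPy sketch of that idea, not the authors' implementation; the function names, toy loss, and hyperparameter values are illustrative assumptions.

```python
# Minimal sketch (assumption: not the authors' code) of a ProxSGD-style update
# for the group-lasso penalty lam * sum_g ||w_g||_2. Each "group" stands for
# the weights of one filter (filter pruning) or one candidate operation (NAS).
import numpy as np


def prox_group_l2(w, threshold):
    """Block soft-thresholding: proximal operator of threshold * ||w||_2.

    Shrinks the whole group and returns exactly zero once its norm drops
    below the threshold, which is what makes entire groups prunable.
    """
    norm = np.linalg.norm(w)
    if norm <= threshold:
        return np.zeros_like(w)
    return (1.0 - threshold / norm) * w


def proxsgd_step(groups, grads, lr, lam):
    """One step: gradient descent on the smooth loss, then the group prox."""
    return [prox_group_l2(w - lr * g, lr * lam) for w, g in zip(groups, grads)]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two illustrative groups: one sizeable and one near-zero operation.
    groups = [rng.normal(size=4), 0.01 * rng.normal(size=4)]
    for _ in range(50):
        grads = [0.1 * w for w in groups]  # toy smooth-loss gradients
        groups = proxsgd_step(groups, grads, lr=0.1, lam=0.05)
    # The near-zero group ends at exactly 0.0 and could be pruned away.
    print([float(np.linalg.norm(w)) for w in groups])
```

Because the prox drives unimportant groups to exactly zero during training, discretizing the architecture afterwards amounts to simply dropping the zeroed filters or operations, which is the property the abstract highlights.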

Metadata
Document Type: Conference Proceeding
Conference Type: Conference paper
Citation link: https://opus.hs-offenburg.de/6455
Bibliographic Information
Title (English): GSparsity: Unifying Network Pruning and Neural Architecture Search by Group Sparsity
Conference: 1st Conference on Automated Machine Learning (AutoML 2022), Late-Breaking Workshop, Jul 25 2022, Baltimore, US (co-located with ICML)
Author: Avraam Chatzimichailidis, Arber Zela, Janis Keuper, Yang Yang
Year of Publication: 2022
First Page: 1
Last Page: 24
Parent Title (English): AutoML Conference 2022 Workshop Track
URL: https://openreview.net/forum?id=r0GeE-arUe5
Language: English
Content Information
Institutes: Fakultät Elektrotechnik, Medizintechnik und Informatik (EMI) (ab 04/2019)
Institutes: Forschung / IMLA - Institute for Machine Learning and Analytics
Institutes: Bibliografie
Tag: autoML; deep learning; neural architecture search; pruning
Formal Details
Relevance: Conference contribution: h5-index < 30
Open Access: Bronze
Licence: Creative Commons - CC BY - Attribution 4.0 International