UniADS: Universal Architecture-Distiller Search for Distillation Gap

Liming Lu, Zhenghan Chen, Xiaoyu Lu*, Yihang Rao, Lujun Li, Shuchao Pang

*Corresponding author for this work

Research output: Contribution to journal › Conference paper › peer-review

10 Citations (Scopus)

Abstract

In this paper, we present UniADS, the first Universal Architecture-Distiller Search framework for co-optimizing the student architecture and the distillation policy. The teacher-student distillation gap limits distillation gains, yet previous approaches seek to discover the ideal student architecture while ignoring distillation settings. In UniADS, we construct a comprehensive search space encompassing an architectural search for student models, the knowledge transformations used in distillation strategies, distance functions, loss weights, and other vital settings. To explore this search space efficiently, we use the NSGA-II genetic algorithm to obtain better crossover and mutation configurations and employ the Successive Halving algorithm to prune the search space, resulting in improved search efficiency and promising results. Extensive experiments are performed on different teacher-student pairs using the CIFAR-100 and ImageNet datasets, and the results consistently demonstrate the superiority of our method over existing approaches. Furthermore, we provide a detailed analysis of the search results, examining the impact of each variable and extracting valuable insights and practical guidance for distillation design and implementation.
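
Although the record itself includes no code, the procedure the abstract describes, an NSGA-II-driven evolutionary search over a joint student-architecture/distillation-policy space with Successive Halving used to prune weak candidates, can be illustrated with a minimal sketch. Everything below is an assumption for illustration, not the authors' implementation: the gene encoding in SPACE, the proxy evaluate() function, the two objectives (proxy error and model cost), and all budgets and rates are hypothetical, and a full NSGA-II would additionally use crowding-distance selection within Pareto fronts.

import random

# Hypothetical joint search space: student architecture choices plus
# distillation-policy settings (knowledge transformation, distance
# function, loss weight), mirroring the variables named in the abstract.
SPACE = {
    "depth":       [8, 14, 20],
    "width":       [16, 32, 64],
    "transform":   ["logits", "feature", "attention"],
    "distance":    ["l2", "kl", "cosine"],
    "loss_weight": [0.1, 0.5, 1.0, 2.0],
}

def sample():
    return {k: random.choice(v) for k, v in SPACE.items()}

def crossover(a, b):
    # Uniform crossover: each gene comes from one of the two parents.
    return {k: random.choice([a[k], b[k]]) for k in SPACE}

def mutate(g, rate=0.2):
    # Resample each gene independently with a small probability.
    return {k: random.choice(SPACE[k]) if random.random() < rate else v
            for k, v in g.items()}

def evaluate(g, budget):
    # Placeholder proxy: a real system would distill the student for
    # `budget` epochs and return (validation error, model cost).
    err = random.random() / budget
    cost = g["depth"] * g["width"]
    return err, cost

def dominates(f1, f2):
    # Pareto dominance for two minimized objectives.
    return all(x <= y for x, y in zip(f1, f2)) and f1 != f2

def nondominated(pop, fits):
    return [g for g, f in zip(pop, fits)
            if not any(dominates(f2, f) for f2 in fits)]

def successive_halving(pop, budgets=(1, 2, 4)):
    # Re-evaluate with an increasing budget, keeping the half with the
    # lowest proxy error at each rung.
    for b in budgets:
        fits = [evaluate(g, b) for g in pop]
        ranked = sorted(zip(pop, fits), key=lambda gf: gf[1][0])
        pop = [g for g, _ in ranked[:max(2, len(pop) // 2)]]
    return pop

def search(pop_size=16, generations=5):
    pop = [sample() for _ in range(pop_size)]
    for _ in range(generations):
        survivors = successive_halving(pop)
        children = [mutate(crossover(*random.sample(survivors, 2)))
                    for _ in range(pop_size - len(survivors))]
        pop = survivors + children
    fits = [evaluate(g, 4) for g in pop]
    return nondominated(pop, fits)

if __name__ == "__main__":
    for g in search():
        print(g)

The sketch returns the Pareto set of surviving configurations under the toy objectives; substituting a real distillation-and-evaluation routine for evaluate() would turn it into a (much slower) end-to-end search loop of the kind the paper describes.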

Original language: English
Pages (from-to): 14167-14174
Number of pages: 8
Journal: Proceedings of the AAAI Conference on Artificial Intelligence
Volume: 38
Issue number: 2
DOIs
Publication status: Published - 25 Mar 2024
Event: AAAI Conference on Artificial Intelligence (38th: 2024) - Vancouver, Canada
Duration: 20 Feb 2024 - 27 Feb 2024
