MDLdroidLite: a release-and-inhibit control approach to resource-efficient deep neural networks on mobile devices

Yu Zhang, Tao Gu, Xi Zhang

Research output: Contribution to journal › Article › peer-review

4 Citations (Scopus)


Mobile deep learning (MDL) has emerged as a privacy-preserving learning paradigm for mobile devices. This paradigm offers unique features such as privacy preservation, continual learning, and low-latency inference for building personal mobile sensing (PMS) applications. However, squeezing deep learning onto mobile devices is extremely challenging due to resource constraints. Traditional Deep Neural Networks (DNNs) are usually over-parameterized, hence incurring a huge resource overhead for on-device learning. In this paper, we present a novel on-device deep learning framework named MDLdroidLite that transforms traditional DNNs into resource-efficient model structures for on-device learning. To minimize resource overhead, we propose a novel release-and-inhibit control (RIC) approach based on Model Predictive Control theory to efficiently grow DNNs from tiny to backbone. We also design a gate-based fast adaptation mechanism for channel-level knowledge transformation that quickly adapts new-born neurons to existing neurons, enabling safe parameter adaptation and fast convergence for on-device training. Our evaluations show that MDLdroidLite boosts on-device training on various PMS datasets with 28× to 50× fewer model parameters and 4× to 10× fewer floating-point operations than state-of-the-art model structures, while maintaining the same accuracy level.
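The abstract's core idea of growing a network from tiny to backbone while gating new-born neurons can be illustrated with a toy sketch. The class below is a hypothetical, simplified dense layer (not the paper's actual RIC controller or gate mechanism): newly "released" units start with their gates at zero, so the layer's existing function is preserved until the gates are trained open.

```python
import numpy as np

class GrowableLayer:
    """Toy dense layer that can grow its output units on demand.
    New-born units are gated at 0 so growing the layer does not
    perturb what the existing units already compute. Illustrative
    sketch only -- not MDLdroidLite's actual implementation."""

    def __init__(self, in_dim, out_dim, rng):
        self.rng = rng
        self.W = rng.standard_normal((out_dim, in_dim)) * 0.1
        self.gate = np.ones(out_dim)  # mature units are fully open

    def grow(self, n_new):
        """'Release' n_new units with closed (zero) gates."""
        new_W = self.rng.standard_normal((n_new, self.W.shape[1])) * 0.1
        self.W = np.vstack([self.W, new_W])
        self.gate = np.concatenate([self.gate, np.zeros(n_new)])

    def forward(self, x):
        return self.gate * (self.W @ x)

rng = np.random.default_rng(0)
layer = GrowableLayer(4, 2, rng)
x = rng.standard_normal(4)
y_before = layer.forward(x)

layer.grow(3)  # grow from 2 to 5 output units
y_after = layer.forward(x)

# Existing outputs are unchanged; new units contribute 0 until
# their gates are trained open.
assert np.allclose(y_after[:2], y_before)
assert np.allclose(y_after[2:], 0.0)
```

In a full system, the gates (and new weights) would then be trained jointly so the new capacity is absorbed gradually, which is one way to read the abstract's "safe parameter adaptation" goal.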

Original language: English
Pages (from-to): 3670-3686
Number of pages: 17
Journal: IEEE Transactions on Mobile Computing
Issue number: 10
Early online date: 26 Feb 2021
Publication status: Published - Oct 2022

