Slimmable neural networks for edge devices
Yu, Jiahui
Description
- Title
- Slimmable neural networks for edge devices
- Author(s)
- Yu, Jiahui
- Issue Date
- 2019-02-15
- Director of Research (if dissertation) or Advisor (if thesis)
- Huang, Thomas S.
- Department of Study
- Electrical and Computer Engineering
- Discipline
- Electrical and Computer Engineering
- Degree Granting Institution
- University of Illinois at Urbana-Champaign
- Degree Name
- M.S.
- Degree Level
- Thesis
- Keyword(s)
- slimmable networks
- Abstract
- While methods based on deep learning have achieved major breakthroughs in machine perception and generative modeling, the problem of how to run neural networks within a latency budget on edge devices remains unsolved. This thesis presents a new approach to train a single neural network executable at arbitrary widths for instant and adaptive accuracy-efficiency trade-offs at runtime. First, a simple and general method is presented to train a single neural network executable at different widths (numbers of channels in a layer). The width can be chosen from a predefined set of widths to adaptively optimize accuracy-efficiency trade-offs at runtime. Instead of training individual networks with different width configurations, we train a shared network with switchable batch normalization. At runtime, the network can adjust its width on the fly according to on-device benchmarks and resource constraints, rather than downloading and offloading different models. Our trained networks, named slimmable neural networks, achieve ImageNet classification accuracy similar to (and in many cases better than) that of individually trained MobileNet v1, MobileNet v2, ShuffleNet, and ResNet-50 models at different widths. We also demonstrate better performance of slimmable models compared with individual ones across a wide range of applications, including COCO bounding-box object detection, instance segmentation, and person keypoint detection, without tuning hyper-parameters. We visualize and discuss the learned features of slimmable networks. Further, we propose a systematic approach to train universally slimmable networks (US-Nets), extending slimmable networks to execute at arbitrary widths and generalizing to networks both with and without batch normalization layers. In addition, we propose two improved training techniques for US-Nets, named the sandwich rule and inplace distillation, to enhance the training process and boost testing accuracy. We show improved performance of universally slimmable MobileNet v1 and MobileNet v2 on the ImageNet classification task, compared with individually trained models and 4-switch slimmable network baselines. We also evaluate the proposed US-Nets and improved training techniques on image super-resolution and deep reinforcement learning tasks. Extensive ablation experiments on these representative tasks demonstrate the effectiveness of the proposed methods. Our discovery opens up the possibility of directly evaluating the FLOPs-accuracy spectrum of network architectures. Finally, we demonstrate an application that searches for channel number configurations based on the proposed slimmable networks. (A minimal code sketch of switchable batch normalization and the sandwich rule follows this record.)
- Graduation Semester
- 2019-05
- Type of Resource
- text
- Permalink
- http://hdl.handle.net/2142/105133
- Copyright and License Information
- Copyright 2019 Jiahui Yu
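As referenced at the end of the abstract, the following is a minimal PyTorch-style sketch (an assumed framework, not the thesis code) of the two ideas the abstract describes: switchable batch normalization for a predefined set of widths, and the sandwich rule with inplace distillation for universally slimmable training. The class `SwitchableBatchNorm2d`, the helper `model.set_width(...)`, and the function `sandwich_rule_step` are illustrative names introduced here, not identifiers from the original implementation.

```python
# Illustrative sketch only; names and structure are assumptions, not the thesis code.
import random
import torch
import torch.nn as nn
import torch.nn.functional as F


class SwitchableBatchNorm2d(nn.Module):
    """Keeps one BatchNorm2d per switch so each width has its own feature statistics."""

    def __init__(self, num_features: int, width_mult_list=(0.25, 0.5, 0.75, 1.0)):
        super().__init__()
        self.width_mult_list = list(width_mult_list)
        self.bns = nn.ModuleList(
            nn.BatchNorm2d(max(1, int(num_features * w))) for w in self.width_mult_list
        )
        self.active_idx = len(self.width_mult_list) - 1  # default to full width

    def switch(self, width_mult: float) -> None:
        # Select which set of batch-norm statistics to use at runtime.
        self.active_idx = self.width_mult_list.index(width_mult)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x is assumed to be already sliced to the active width by surrounding slimmable layers.
        return self.bns[self.active_idx](x)


def sandwich_rule_step(model, images, labels, optimizer,
                       width_range=(0.25, 1.0), n_random_widths=2):
    """One training step following the sandwich rule with inplace distillation.

    Assumes `model.set_width(w)` (a hypothetical helper) reconfigures every slimmable
    layer to width multiplier `w`. The widest model is supervised by ground-truth labels
    and its soft predictions teach the narrower configurations.
    """
    optimizer.zero_grad()

    # 1. Widest model: ordinary cross-entropy against ground truth.
    model.set_width(width_range[1])
    logits_full = model(images)
    F.cross_entropy(logits_full, labels).backward()
    soft_targets = logits_full.detach().softmax(dim=1)

    # 2. Smallest width plus a few random widths: distill in place from the widest model.
    widths = [width_range[0]] + [random.uniform(*width_range) for _ in range(n_random_widths)]
    for w in widths:
        model.set_width(w)
        logits = model(images)
        F.kl_div(logits.log_softmax(dim=1), soft_targets, reduction="batchmean").backward()

    # Single update from the accumulated gradients of all sampled widths.
    optimizer.step()
```

The sketch accumulates gradients from the widest model (trained against ground-truth labels), the narrowest width, and a few randomly sampled widths (trained against the widest model's soft predictions) before taking one optimizer step, mirroring the sandwich rule and inplace distillation summarized in the abstract.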
Owning Collections
- Graduate Dissertations and Theses at Illinois (PRIMARY)
- Dissertations and Theses - Electrical and Computer Engineering