Advisor
Kindratenko, Volodymyr
Department of Study
Electrical & Computer Engineering
Discipline
Electrical & Computer Engineering
Degree Granting Institution
University of Illinois at Urbana-Champaign
Degree Name
M.S.
Degree Level
Thesis
Date of Ingest
2020-08-26T23:58:46Z
Keyword(s)
Pruning
DNN
Abstract
In recent years, deep neural networks have achieved remarkable results in artificial intelligence tasks such as image recognition and machine translation. Although deep neural networks reach state-of-the-art accuracy on many tasks, this accuracy comes at the cost of high computational complexity and large model size, i.e., over-parameterization. These costs limit the deployment of deep neural networks on low-power and mobile platforms. Techniques such as pruning, quantization, and binarization have been developed to reduce the number of network parameters. Pruning is one of the most well-studied methods for addressing over-parameterization and reducing computational complexity.
A typical pruning process consists of three stages: train the network, remove redundant weights according to a chosen criterion, and fine-tune the reduced network. However, this three-stage method is time-consuming and often cannot fully compensate for pruned neurons that are actually important. In this thesis, we propose a new algorithm that merges the removal and fine-tuning stages into the training stage. The new algorithm reduces the time complexity of the pruning process and makes incorrectly pruned neurons recoverable without significant loss of accuracy. We also explore how different pruning-criterion calculations, weight-update methods, and pruning-rate schedules affect pruning performance.
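The idea of folding pruning into training can be illustrated with a minimal PyTorch sketch. This is not the thesis's actual algorithm: it assumes a magnitude-based pruning criterion, and the helper names (`magnitude_masks`, `apply_masks`, `prune_rate_schedule`) are illustrative. The key point it demonstrates is recoverability: masks are recomputed each epoch from the current weights, so a weight zeroed by mistake can regrow through gradient updates and survive the next mask.

```python
import torch
import torch.nn as nn

def magnitude_masks(model, prune_rate):
    """Build 0/1 masks that zero the smallest-|w| fraction of each linear layer.
    Magnitude is one possible criterion; the thesis also studies others."""
    masks = {}
    for name, module in model.named_modules():
        if isinstance(module, nn.Linear):
            flat = module.weight.data.abs().flatten()
            k = int(prune_rate * flat.numel())
            if k > 0:
                threshold = flat.kthvalue(k).values
                masks[name] = (module.weight.data.abs() > threshold).float()
            else:
                masks[name] = torch.ones_like(module.weight.data)
    return masks

def apply_masks(model, masks):
    """Zero out the weights selected for pruning."""
    for name, module in model.named_modules():
        if name in masks:
            module.weight.data.mul_(masks[name])

def train_with_pruning(model, train_loader, epochs, prune_rate_schedule):
    """Single-stage loop: prune at each epoch boundary, then keep training.
    `prune_rate_schedule` maps epoch -> pruning rate, e.g. a ramp-up."""
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()
    for epoch in range(epochs):
        # Recompute masks from the *current* weights and prune.
        masks = magnitude_masks(model, prune_rate_schedule(epoch))
        apply_masks(model, masks)
        # Train normally for the rest of the epoch. Pruned weights still
        # receive gradients, so an incorrectly pruned weight can regrow
        # and be retained when masks are recomputed next epoch.
        for x, y in train_loader:
            opt.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            opt.step()
    return model

# Example schedule: ramp the pruning rate from 0 up to 50% over training.
# train_with_pruning(model, train_loader, epochs=20,
#                    prune_rate_schedule=lambda e: min(0.5, 0.05 * e))
```

Because pruning happens at epoch boundaries rather than in a separate post-training stage, there is no standalone fine-tuning pass; the remaining training steps of each epoch serve that role.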