Deep Neural Networks Pruning
This project aims to find a new way to prune deep neural networks.
Overview / Usage
This field starts with Yann LeCun's 1989 paper, Optimal Brain Damage, which showed that many parameters in neural networks are unimportant. Today, as networks grow deeper, they demand ever more computation and memory. For example, a 100-layer ResNet requires more than 11 GFLOPs (billion floating-point operations) to run inference on a single 224×224 image, and it stores roughly 1.7 million 32-bit floating-point numbers. Deploying such large models on resource-limited platforms, e.g., mobile phones and embedded devices, is a great challenge, so we need to prune deep neural networks.
There are now many ways to prune: for example, defining a metric that evaluates the importance of each parameter, or formulating pruning as an optimization problem and solving it. My advisor and I are trying to find a new approach. A minimal sketch of the first, metric-based idea appears below.
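As a concrete illustration of the importance-metric approach, here is a minimal PyTorch sketch of magnitude-based pruning, where a weight's absolute value serves as its importance score. This is a standard baseline, not this project's proposed method; the `sparsity` level and the toy model are illustrative assumptions.

```python
import torch
import torch.nn as nn

def magnitude_prune(model: nn.Module, sparsity: float = 0.5) -> None:
    """Zero out the smallest `sparsity` fraction of weights in each
    linear/conv layer, using absolute magnitude as the importance metric."""
    for module in model.modules():
        if isinstance(module, (nn.Linear, nn.Conv2d)):
            weight = module.weight.data
            # Importance metric: absolute magnitude of each weight.
            importance = weight.abs().flatten()
            k = int(sparsity * importance.numel())
            if k == 0:
                continue
            # Threshold below which weights are treated as unimportant.
            threshold = importance.kthvalue(k).values
            # Prune: zero out all weights at or below the threshold.
            mask = (weight.abs() > threshold).to(weight.dtype)
            module.weight.data *= mask

# Example usage on a toy network (illustrative only):
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
magnitude_prune(model, sparsity=0.5)
```

In practice, pruned networks are usually fine-tuned afterward to recover the accuracy lost by zeroing weights; the sketch above only performs the pruning step itself.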