Deep Neural Networks Pruning

玮 李

Hangzhou, Zhejiang

This project aims to find a new way to prune deep neural networks.

Artificial Intelligence


Description

This field traces back to Yann LeCun's 1989 paper, Optimal Brain Damage, which showed that many parameters in a trained neural network are unimportant. As networks grow deeper, they demand more computation and more parameters. For example, a 100-layer ResNet requires more than 11 giga floating-point operations to run inference on a single 224×224 image, and stores about 1.7 million 32-bit floating-point numbers. Deploying such large models on resource-limited platforms, e.g., mobile and embedded devices, is a great challenge, so we need to prune deep neural networks. There are many pruning approaches today: for example, defining a metric to evaluate the importance of each parameter, or formulating pruning as an optimization problem and solving it. My advisor and I are trying to find a new way to prune.
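The importance-metric approach mentioned above can be illustrated with weight-magnitude pruning, a common baseline in which the parameters with the smallest absolute values are zeroed out. This is a minimal NumPy sketch for illustration only, not the project's method; the function name and interface are hypothetical:

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the `sparsity` fraction of weights with smallest |w|.

    Uses |w| as the importance metric: small-magnitude weights are
    assumed to contribute least to the network's output.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of weights to remove
    if k == 0:
        return weights.copy()
    # Threshold = k-th smallest absolute value (ties may prune a few extra).
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

# Example: prune half of a tiny 2x2 weight matrix.
w = np.array([[0.1, -0.5],
              [2.0, -0.05]])
pruned = magnitude_prune(w, 0.5)  # zeros the entries 0.1 and -0.05
```

In practice, pruning like this is usually followed by fine-tuning the remaining weights to recover accuracy.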
