LeGR: Filter Pruning via Learned Global Ranking

May 2019

tl;dr: Learn a global ranking metric (an affine-transformed L2 norm per layer) so that pruning filters is as easy as thresholding this measure. The paper provides a practical tool that produces multiple pruned networks along the Pareto front in one shot.
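
A minimal sketch of what that thresholding looks like, assuming the per-layer affine coefficients (alpha, kappa) have already been learned (in the paper they are found by an evolutionary search); the function and variable names here are illustrative, not the paper's code:

```python
import torch
import torch.nn as nn

def global_filter_scores(model, alpha, kappa):
    """Score every filter with an affine-transformed (squared) L2 norm:
    alpha[layer] * ||w_filter||_2^2 + kappa[layer]."""
    scores = []  # (layer_name, filter_index, score)
    for name, module in model.named_modules():
        if isinstance(module, nn.Conv2d):
            # Squared L2 norm of each output filter's weights
            norms = module.weight.detach().flatten(1).pow(2).sum(dim=1)
            layer_scores = alpha[name] * norms + kappa[name]
            for i, s in enumerate(layer_scores.tolist()):
                scores.append((name, i, s))
    return scores

def filters_to_prune(scores, keep_ratio):
    """Threshold the single global ranking: mark the lowest-scoring filters."""
    ranked = sorted(scores, key=lambda t: t[2])
    num_prune = int(len(ranked) * (1 - keep_ratio))
    return ranked[:num_prune]
```

Because the ranking is learned once, sweeping keep_ratio over several values yields a family of pruned networks along the Pareto front without re-running the search.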

Overall impression

Previous methods rely on costly search to trace the Pareto front of the performance-resource tradeoff curve. Resource-constrained filter pruning is the main topic of this study, and LeGR performs better than AutoML for Model Compression (AMC) and MorphNet. Global filter ranking can also be obtained from a first-order Taylor approximation of the loss increase caused by pruning, as sketched below (sample data is needed to compute gradients, so it is not a data-free method).
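
A rough sketch of that Taylor-based alternative, under the assumption that importance is scored as |grad . weight| per filter; the helper name and signature are hypothetical:

```python
import torch.nn as nn

def taylor_filter_importance(model, loss_fn, data, target):
    """First-order Taylor estimate of the loss increase if a filter were removed:
    |sum over the filter of grad * weight|, computed from one batch of sample data."""
    model.zero_grad()
    loss = loss_fn(model(data), target)
    loss.backward()

    importance = {}
    for name, module in model.named_modules():
        if isinstance(module, nn.Conv2d) and module.weight.grad is not None:
            # Per-filter first-order term of the Taylor expansion of the loss
            contrib = (module.weight.grad * module.weight).flatten(1).sum(dim=1)
            importance[name] = contrib.abs().detach()
    return importance
```

Since these scores already live on a single scale across layers, they can be ranked globally and thresholded the same way as the learned affine-transformed norms above.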

Key ideas

Technical details

Notes