MnasNet: Platform-Aware Neural Architecture Search for Mobile

May 2019

tl;dr: Search the neighborhood of MobileNetV2.

Overall impression

One of the main challenges of NAS is its vast search space. This paper uses MobileNetV2 as a starting point and significantly reduces the search space. The M in MnasNet stands for mobile.
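
As a rough sketch, the reduced search space can be pictured as a handful of per-block choices layered on top of a MobileNetV2-like backbone (the paper calls this a factorized hierarchical search space). The value sets below are illustrative rather than the paper's exact configuration.

```python
import random

# Illustrative per-block choices in the spirit of MnasNet's factorized
# hierarchical search space; the exact choice sets are approximations.
BLOCK_SEARCH_SPACE = {
    "conv_op": ["conv", "depthwise_conv", "mbconv"],  # mbconv = mobile inverted bottleneck
    "kernel_size": [3, 5],
    "se_ratio": [0.0, 0.25],            # squeeze-and-excitation ratio
    "skip_op": ["none", "identity"],
    "filter_mult": [0.75, 1.0, 1.25],   # scale filters relative to the MobileNetV2 baseline
    "layer_delta": [-1, 0, 1],          # add/remove layers vs. the MobileNetV2 block depth
}

def sample_block(rng: random.Random) -> dict:
    """Sample one block configuration; layers within a block share it."""
    return {key: rng.choice(values) for key, values in BLOCK_SEARCH_SPACE.items()}

# Example: sample a 7-block architecture near the baseline.
rng = random.Random(0)
architecture = [sample_block(rng) for _ in range(7)]
```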

The search algorithm can be seen as an evolutionary algorithm, essentially a glorified for loop.
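
A minimal sketch of that viewpoint, assuming the per-block sampler above: repeatedly sample candidate architectures near the MobileNetV2 baseline, score each with the paper's latency-aware objective ACC(m) * [LAT(m)/T]^w, and keep the best. The paper itself trains an RNN controller with reinforcement learning rather than doing random search; `train_and_eval` and `measure_latency_ms` are hypothetical placeholders, and the value of w is only illustrative.

```python
import random

def mnas_reward(acc: float, latency_ms: float, target_ms: float = 75.0, w: float = -0.07) -> float:
    # Multi-objective reward from the paper: ACC(m) * [LAT(m) / T]^w.
    # The w used here is illustrative of the soft latency constraint.
    return acc * (latency_ms / target_ms) ** w

def naive_search(num_trials: int = 100, num_blocks: int = 7, seed: int = 0):
    """The 'glorified for loop' view: random search over the reduced space.
    (The paper replaces this loop with an RL-trained RNN controller.)"""
    rng = random.Random(seed)
    best_arch, best_reward = None, float("-inf")
    for _ in range(num_trials):
        arch = [sample_block(rng) for _ in range(num_blocks)]  # sampler sketched above
        acc = train_and_eval(arch)             # hypothetical: accuracy on a proxy task
        latency_ms = measure_latency_ms(arch)  # hypothetical: latency measured on a real phone
        r = mnas_reward(acc, latency_ms)
        if r > best_reward:
            best_arch, best_reward = arch, r
    return best_arch
```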

The performance is overtaken by FBNet, also published at CVPR 2019, which uses a differentiable optimization method instead of training a controller.
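
For contrast, a generic sketch of the differentiable approach: each layer is a softmax-weighted mixture of candidate ops, and the mixture weights (architecture parameters) are learned by gradient descent together with the network weights. This shows the general relaxation idea rather than FBNet's exact formulation, which samples ops with a Gumbel-softmax and adds a latency term from a lookup table.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """Softmax-weighted mixture over candidate ops; the architecture
    parameters `alpha` are optimized by gradient descent alongside the
    regular weights, so no separate controller needs to be trained."""
    def __init__(self, candidate_ops):
        super().__init__()
        self.ops = nn.ModuleList(candidate_ops)
        self.alpha = nn.Parameter(torch.zeros(len(candidate_ops)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

# Example: one searchable layer mixing a 3x3 conv, a 5x5 conv, and identity.
layer = MixedOp([
    nn.Conv2d(16, 16, kernel_size=3, padding=1),
    nn.Conv2d(16, 16, kernel_size=5, padding=2),
    nn.Identity(),
])
y = layer(torch.randn(1, 16, 32, 32))
```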

Key ideas

Technical details

Notes