
Searching for MobileNetV3

May 2019

tl;dr: Combines automated search (platform-aware NAS and NetAdapt) with novel architecture advances (SE blocks, the hard-swish activation, and the hard sigmoid) to design MobileNetV3.
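
For concreteness, here is a minimal PyTorch sketch of the hard sigmoid and hard-swish activations as defined in the paper (ReLU6-based approximations of sigmoid and swish); the module names are my own, not from a reference implementation.

```python
import torch.nn as nn
import torch.nn.functional as F

class HardSigmoid(nn.Module):
    """hard-sigmoid(x) = ReLU6(x + 3) / 6, a piecewise-linear stand-in for sigmoid."""
    def forward(self, x):
        return F.relu6(x + 3.0) / 6.0

class HardSwish(nn.Module):
    """h-swish(x) = x * ReLU6(x + 3) / 6, a cheap approximation of swish."""
    def forward(self, x):
        return x * F.relu6(x + 3.0) / 6.0
```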

Overall impression

Improves upon MobileNetV2 (inverted residuals and linear bottlenecks) and MnasNet (NAS-optimized MobileNetV2). The paper itself does not dwell on the NAS procedure but instead reports the searched result, a deterministic model, much like the MnasNet paper. In particular, MobileNetV3-Large uses MnasNet as the baseline and applies NetAdapt to fine-tune it.

The idea of NetAdapt seems practical. It is complementary to NAS: after platform-aware NAS finalizes the block-level structure, NetAdapt fine-tunes the number of filters in each layer.
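
A rough sketch of the NetAdapt-style loop, assuming hypothetical helper callables (`generate_proposals`, `short_finetune`, `evaluate_acc`, `measure_latency`) that stand in for the real proposal-generation, training, and latency-measurement pipeline:

```python
def netadapt(seed_net, target_latency, latency_step,
             generate_proposals, short_finetune, evaluate_acc, measure_latency):
    """Sketch of the NetAdapt loop used to trim a NAS-found network.

    All helper callables are hypothetical placeholders for illustration.
    """
    net = seed_net
    while measure_latency(net) > target_latency:
        best_net, best_score = None, float("-inf")
        base_acc, base_lat = evaluate_acc(net), measure_latency(net)
        # Each proposal shrinks the filter count of one layer enough to save
        # at least `latency_step` of latency on the target device.
        for proposal in generate_proposals(net, latency_step):
            short_finetune(proposal)  # brief fine-tuning, not full training
            # MobileNetV3 modification: pick the proposal that maximizes
            # accuracy change per unit of latency saved.
            score = (evaluate_acc(proposal) - base_acc) / (
                base_lat - measure_latency(proposal))
            if score > best_score:
                best_net, best_score = proposal, score
        net = best_net
    return net  # the chosen architecture is then retrained from scratch
```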

The most useful takeaway is the MobileNetV3-Large and MobileNetV3-Small backbones. See the PyTorch reimplementation on GitHub.
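
For reference, below is a simplified PyTorch sketch of the bneck building block (inverted residual with optional squeeze-and-excite and h-swish) that both backbones stack. The exact layer configurations come from the paper's tables; the class and parameter names here are assumptions, not the reference implementation.

```python
import torch.nn as nn
import torch.nn.functional as F

class SqueezeExcite(nn.Module):
    """SE block with the hard-sigmoid gate used in MobileNetV3."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc1 = nn.Conv2d(channels, channels // reduction, 1)
        self.fc2 = nn.Conv2d(channels // reduction, channels, 1)

    def forward(self, x):
        s = F.adaptive_avg_pool2d(x, 1)
        s = F.relu(self.fc1(s))
        s = F.relu6(self.fc2(s) + 3.0) / 6.0  # hard sigmoid gate
        return x * s

class Bneck(nn.Module):
    """Inverted residual: 1x1 expand -> depthwise -> (SE) -> 1x1 linear project."""
    def __init__(self, in_ch, exp_ch, out_ch, kernel, stride, use_se, use_hs):
        super().__init__()
        # h-swish or ReLU, depending on the block (per the paper's tables).
        self.act = (lambda x: x * F.relu6(x + 3.0) / 6.0) if use_hs else F.relu
        self.expand = nn.Conv2d(in_ch, exp_ch, 1, bias=False)
        self.bn1 = nn.BatchNorm2d(exp_ch)
        self.dw = nn.Conv2d(exp_ch, exp_ch, kernel, stride,
                            padding=kernel // 2, groups=exp_ch, bias=False)
        self.bn2 = nn.BatchNorm2d(exp_ch)
        self.se = SqueezeExcite(exp_ch) if use_se else nn.Identity()
        self.project = nn.Conv2d(exp_ch, out_ch, 1, bias=False)
        self.bn3 = nn.BatchNorm2d(out_ch)
        self.use_res = (stride == 1 and in_ch == out_ch)

    def forward(self, x):
        y = self.act(self.bn1(self.expand(x)))
        y = self.act(self.bn2(self.dw(y)))
        y = self.se(y)
        y = self.bn3(self.project(y))  # linear bottleneck: no activation here
        return x + y if self.use_res else y
```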

The MobileNet series is about being fast yet accurate. EfficientNet is about scaling up MobileNet to reach SOTA accuracy while staying efficient.

Key ideas

Technical details

Notes