EENAS: An Efficient Evolutionary Algorithm for Neural Architecture Search
Proceedings of The 14th Asian Conference on Machine
Learning, PMLR 189:1261-1276, 2023.
Abstract
Neural Architecture Search (NAS) has been widely
applied to automatic neural architecture
design. Traditional NAS methods often evaluate a
large number of architectures, incurring expensive
computational overhead. To speed up architecture
search, recent NAS methods employ network
estimation strategies to guide the selection of
promising architectures. In this paper, we
propose an efficient evolutionary algorithm for
NAS, which adopts an advanced proxy based on
synthetic signal bases for architecture
estimation. Extensive experiments show that our
method outperforms state-of-the-art NAS methods on
the NAS-Bench-101 and NAS-Bench-201 search
spaces (CIFAR-10, CIFAR-100 and
ImageNet16-120). Compared with existing works, our
method identifies better architectures with
greatly reduced search time.
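To make the idea concrete, below is a minimal, hypothetical sketch of proxy-guided evolutionary search. The architecture encoding, the `proxy_score` function, and the mutation scheme are illustrative placeholders only; they do not reproduce the paper's synthetic-signal-bases estimator or its exact evolutionary operators.

```python
import random

ARCH_LEN = 6   # number of choice slots in a toy architecture encoding
NUM_OPS = 5    # candidate operations per slot

def proxy_score(arch):
    # Placeholder proxy: a cheap deterministic score standing in for a
    # training-free architecture estimator.
    return sum((i + 1) * op for i, op in enumerate(arch))

def mutate(arch):
    # Replace one randomly chosen slot with a random operation.
    child = list(arch)
    i = random.randrange(ARCH_LEN)
    child[i] = random.randrange(NUM_OPS)
    return tuple(child)

def evolve(pop_size=20, generations=30, seed=0):
    random.seed(seed)
    population = [tuple(random.randrange(NUM_OPS) for _ in range(ARCH_LEN))
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Tournament selection: mutate the best of a random sample,
        # then drop the oldest member (ageing evolution).
        sample = random.sample(population, 5)
        parent = max(sample, key=proxy_score)
        population.append(mutate(parent))
        population.pop(0)
    return max(population, key=proxy_score)

best = evolve()
print(best, proxy_score(best))
```

Because every candidate is scored by the proxy rather than by training, each generation costs only a few score evaluations, which is the source of the speed-up the abstract claims.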