
Neural Architecture Search with Structure Complexity Control

EasyChair Preprint 7973

12 pages · Date: May 21, 2022

Abstract

The paper investigates the problem of deep learning model selection. We propose a method for neural architecture search with respect to a desired model complexity, where complexity is measured as the number of parameters in the selected architecture. The method is based on the differentiable architecture search algorithm (DARTS). Instead of optimizing the structural parameters of the architecture directly, we treat them as a function of the complexity parameter. To evaluate the quality of the proposed algorithm, we conduct experiments on the Fashion-MNIST and CIFAR-10 datasets and compare the resulting architectures with those obtained by the original DARTS method.
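As a rough illustration of this idea (a minimal sketch under assumptions, not the authors' implementation), the structural parameters of a DARTS-like search space can be produced by a small hypernetwork conditioned on a scalar complexity parameter, rather than being optimized as free variables. All names, layer sizes, and dimensions below are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical sketch: a hypernetwork maps a scalar complexity parameter
# to DARTS-style structural parameters (one weight vector per edge over
# the candidate operations). Sizes here are illustrative only.
class StructuralHypernet(nn.Module):
    def __init__(self, n_edges: int, n_ops: int, hidden: int = 32):
        super().__init__()
        self.n_edges, self.n_ops = n_edges, n_ops
        self.net = nn.Sequential(
            nn.Linear(1, hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_edges * n_ops),
        )

    def forward(self, lam: torch.Tensor) -> torch.Tensor:
        # lam: scalar complexity parameter, e.g. in [0, 1]
        alpha = self.net(lam.view(1, 1)).view(self.n_edges, self.n_ops)
        # DARTS-style continuous relaxation: softmax over operations per edge
        return F.softmax(alpha, dim=-1)

# Usage: operation weights for one complexity level of a hypothetical cell
hypernet = StructuralHypernet(n_edges=14, n_ops=8)
weights = hypernet(torch.tensor(0.3))   # shape: (14, 8)
print(weights.sum(dim=-1))              # each row sums to 1

In such a scheme, varying the complexity parameter at inference time yields a family of architectures of different sizes from a single search run, instead of a single fixed architecture as in plain DARTS.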

Keyphrases: deep learning, differentiable architecture search, hypernetwork, model complexity control, neural networks

BibTeX entry
BibTeX does not have an entry type for preprints; the following is a workaround that produces a correct reference:
@booklet{EasyChair:7973,
  author       = {Konstantin Yakovlev and Olga Grebenkova and Oleg Bakhteev and Vadim Strijov},
  title        = {Neural Architecture Search with Structure Complexity Control},
  howpublished = {EasyChair Preprint 7973},
  year         = {EasyChair, 2022}}