Local Stochastic Differentiable Architecture Search for Memetic Neuroevolution Algorithms

Even the most efficient approaches to neural architecture search (NAS) can be computationally expensive, leaving little room for inefficiency. Unfortunately, evolutionary approaches to NAS, collectively known as neuroevolution, often suffer from such inefficiencies, due in part to their massive, intractable search spaces. In recent years, differentiable approaches to NAS have garnered significant attention for their efficiency and performance. In this work, we propose a hybrid algorithm: a neuroevolutionary metaheuristic that employs a local stochastic differentiable architecture search during the fitness evaluation step. Our preliminary results demonstrate that this approach is effective at selecting recurrent neural network memory cells.
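To make the hybrid structure concrete, the following is a minimal illustrative sketch of a memetic loop of this kind, not the paper's implementation. It assumes a toy per-operation cost table (OP_COST) standing in for the validation loss of a briefly trained RNN cell; the constants (NUM_EDGES, NUM_OPS, TAU, LOCAL_STEPS, LR) and helper names are all hypothetical. Each genome holds architecture logits; fitness evaluation runs a few gradient steps on a Gumbel-softmax relaxation (the "local stochastic differentiable search") and writes the refined logits back into the genome (Lamarckian inheritance), while an outer evolutionary loop handles selection and mutation.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_EDGES = 4     # hypothetical: decision points inside a memory cell
NUM_OPS = 5       # hypothetical: candidate operations per decision point
TAU = 1.0         # Gumbel-softmax temperature
LOCAL_STEPS = 20  # gradient steps per fitness evaluation
LR = 0.5

# Toy surrogate: a fixed per-(edge, op) cost standing in for the loss of an
# RNN cell assembled from the chosen operations and trained on the task.
OP_COST = rng.uniform(0.0, 1.0, size=(NUM_EDGES, NUM_OPS))

def gumbel_softmax(logits, tau, rng):
    """Sample a relaxed one-hot operation choice per edge."""
    g = rng.gumbel(size=logits.shape)
    z = (logits + g) / tau
    z -= z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def local_differentiable_search(logits, rng):
    """Local stochastic differentiable search: a few gradient steps on the
    expected surrogate loss, estimated with single Gumbel-softmax samples."""
    logits = logits.copy()
    for _ in range(LOCAL_STEPS):
        y = gumbel_softmax(logits, TAU, rng)                 # (edges, ops)
        exp_cost = (y * OP_COST).sum(axis=-1, keepdims=True)
        # Analytic gradient of (y . cost) w.r.t. the logits for a
        # temperature-TAU softmax: y_j * (cost_j - y . cost) / TAU.
        grad = y * (OP_COST - exp_cost) / TAU
        logits -= LR * grad
    # Fitness: negative expected cost under the deterministic relaxation.
    p = np.exp(logits - logits.max(axis=-1, keepdims=True))
    p /= p.sum(axis=-1, keepdims=True)
    return logits, -float((p * OP_COST).sum())

def evolve(pop_size=8, generations=10):
    pop = [rng.normal(size=(NUM_EDGES, NUM_OPS)) for _ in range(pop_size)]
    for gen in range(generations):
        # Fitness evaluation doubles as local refinement; the improved
        # logits are written back into the genome (Lamarckian, memetic).
        scored = [local_differentiable_search(g, rng) for g in pop]
        scored.sort(key=lambda t: t[1], reverse=True)
        elite = [g for g, _ in scored[: pop_size // 2]]
        # Mutation: Gaussian noise on the architecture logits.
        pop = elite + [g + 0.1 * rng.normal(size=g.shape) for g in elite]
        print(f"gen {gen}: best fitness {scored[0][1]:.4f}")
    return scored[0][0]

if __name__ == "__main__":
    best = evolve()
    print("selected op per edge:", best.argmax(axis=-1))
```

In a real setting, the surrogate cost would be replaced by training and validating a candidate cell, and the outer loop could use any neuroevolutionary operators; the sketch only shows where the differentiable local search slots into fitness evaluation.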